Shannon’s definition of information relates the information content of an event to its probability through the formula:
$$I = \log_2(\frac{1}{p})$$
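As a quick illustration (a minimal sketch, not taken from the source), this quantity can be computed directly:

```python
import math

def information_content(p: float) -> float:
    """Shannon information content of an event with probability p, in bits."""
    return math.log2(1 / p)

# A fair coin flip carries 1 bit; a 1-in-8 event carries 3 bits.
print(information_content(0.5))    # 1.0
print(information_content(1 / 8))  # 3.0
```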
Compact codes such as Morse code tend to assign short codewords to frequent items. Ideally, following Shannon’s definition, code length \(L\) should be related to frequency \(F\) through \(L = \log_2(1/F)\). Since the length of an item’s encoding in any code is an upper bound on its Kolmogorov complexity \(K\), frequency provides, in the absence of a better estimate, a proxy for \(K\):
$$K \approx \log_2(1/F)$$
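As a rough illustration of this proxy (a sketch under the assumption that relative frequencies in some corpus stand in for probabilities; the frequencies below are hypothetical, chosen only for the example):

```python
import math

def complexity_proxy(frequency: float) -> float:
    """Approximate Kolmogorov complexity K (in bits) from an item's
    relative frequency, using K ~ log2(1/F)."""
    return math.log2(1 / frequency)

# Hypothetical relative frequencies of items in some corpus (illustrative only).
frequencies = {"the": 0.05, "code": 0.001, "Kolmogorov": 0.000002}

for item, f in frequencies.items():
    print(f"{item}: K ~ {complexity_proxy(f):.1f} bits")
# Rarer items get larger complexity estimates:
# "the" ~ 4.3 bits, "code" ~ 10.0 bits, "Kolmogorov" ~ 18.9 bits.
```

The design choice here is simply to read the frequency off a corpus and plug it into the formula; the estimate is only as good as the frequency data and says nothing about items never observed.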