Coding theorem method

cosmos 17th January 2018 at 5:49pm

See MMathPhys oral presentation, Algorithmic information theory

Using the coding theorem to estimate the Kolmogorov complexity of short strings. The estimate is defined as:

K_m(s) = -\log_2(D(n,k)(s))

where

D(n,k)(s) := \frac{|\{T \in (n,k) : T \text{ produces } s\}|}{|\{T \in (n,k) : T \text{ halts}\}|}

where (n,k) is the set of Turing machines with n states and k letters in the alphabet of the input tape. The Turing machines are run on a blank tape, and whether a machine halts is decided using known values of the Busy beaver function.
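Given an output frequency distribution D(n,k) over the enumerated machines, the estimate is just a negative log of the observed frequency. A minimal sketch (the counts below are made-up toy numbers, not real (n,k) enumeration results):

```python
import math

def km_estimate(s, output_counts):
    """CTM-style complexity estimate: K_m(s) = -log2(D(n,k)(s)),
    where D(n,k)(s) is the fraction of halting machines producing s."""
    total_halting = sum(output_counts.values())
    freq = output_counts[s] / total_halting
    return -math.log2(freq)

# Toy frequency distribution standing in for a real (n,k) enumeration:
# frequent outputs (like "0") get low complexity, rare ones get high.
counts = {"0": 400, "1": 400, "01": 100, "10": 100}

print(km_estimate("0", counts))   # more frequent => lower K_m
print(km_estimate("01", counts))  # rarer => higher K_m
```

Note the inversion of the coding theorem at work: strings produced by many machines are assigned low complexity, mirroring K(s) ≈ -log2 m(s) for the algorithmic probability m(s).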

An extension to N-dimensional arrays has been developed using the Block decomposition method.
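The Block decomposition method partitions an object into small blocks whose CTM values are known, then sums the CTM of each distinct block plus a log term for its multiplicity: BDM(x) = Σ_i [CTM(b_i) + log2(n_i)], where n_i counts occurrences of block b_i. A sketch for the 1-D case, with a made-up CTM lookup table in place of real precomputed (n,k) values:

```python
import math
from collections import Counter

# Hypothetical CTM values for length-2 blocks; a real application would
# use precomputed values from the (n,k) Turing machine enumeration.
TOY_CTM = {"00": 3.5, "11": 3.5, "01": 2.0, "10": 2.0}

def bdm(s, block_size=2):
    """Block decomposition estimate:
    sum over distinct blocks of CTM(block) + log2(multiplicity)."""
    blocks = [s[i:i + block_size] for i in range(0, len(s), block_size)]
    counts = Counter(blocks)
    return sum(TOY_CTM[b] + math.log2(n) for b, n in counts.items())

print(bdm("01010101"))  # one distinct block repeated 4 times
print(bdm("00110110"))  # several distinct blocks => larger estimate
```

Repetition is cheap under BDM: repeating a block only adds log2 of its count, rather than its full CTM value each time.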

Hector Zenil

New paper investigating its compression properties: https://www.hindawi.com/journals/complexity/2017/7208216/ – it performs better than other methods, but I think not by much.


See this paper and this one. For some reason this seems to be a popular idea in Psychology.

Using these methods the people at Algorithmic nature group made The Online Algorithmic Complexity Calculator

More: Numerical evaluation of algorithmic complexity for short strings: A glance into the innermost structure of randomness

Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines