Entropy

cosmos 14th March 2019 at 7:57pm
Information measures Statistical physics

The entropy, H(X), of a Random variable, X, with Probability distribution p is defined as

H(X) = - \sum_x p(x) \log p(x)

Intuitively, entropy (with log base 2) is the number of yes/no questions you expect to need to ask to identify the state of the world, under a Model of the world (a Probability distribution over states of the world). I.e. how ignorant I think I am about the world.
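A minimal sketch of the definition (using log base 2, so the units are bits, i.e. yes/no questions; the function name and example distributions are illustrative):

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a distribution given as a list of probabilities.

    Terms with p(x) = 0 are skipped, following the convention 0 log 0 = 0.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin takes one yes/no question to resolve:
print(entropy([0.5, 0.5]))        # 1.0
# A biased coin is more predictable, so its entropy is lower:
print(entropy([0.9, 0.1]))
# A uniform distribution over 4 outcomes needs 2 questions:
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

Note that entropy is maximised by the uniform distribution and is zero for a deterministic one, matching the "expected ignorance" reading.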

video

Entropy and Kolmogorov complexity: thesis

Concavity of entropy