Entropy rate


The entropy rate of an information source (see Data transmission) is the average entropy per letter of the source.

An Information source is often modelled as a Discrete-time stochastic process $\{X_k\}$, where each $X_k$ is called a "letter". The entropy rate is then defined as:

$H_X = \lim_{n\rightarrow \infty} \frac{1}{n} H(X_1, X_2, \cdots, X_n)$

when the limit exists (see also Shannon-McMillan-Breiman theorem).
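
By the chain rule for entropy, the block entropy decomposes into per-letter conditional entropies, so the entropy rate is the long-run average of the new uncertainty each letter carries given all previous ones:

$H(X_1, X_2, \cdots, X_n) = \sum_{k=1}^n H(X_k \mid X_1, \cdots, X_{k-1})$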

Chapter 2 Information Measures - Section 2.10 Entropy Rate of a Stationary Source

One can define a related measure, $H'_X$, using conditional entropies. It can be shown that, for a stationary Information source, the entropy rate exists and is equal to $H'_X$.
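
Concretely, the conditional-entropy version (following the cited Section 2.10) is

$H'_X = \lim_{n\rightarrow \infty} H(X_n \mid X_1, \cdots, X_{n-1})$

For a stationary source, $H(X_n \mid X_1, \cdots, X_{n-1})$ is non-increasing in $n$ (by stationarity plus the fact that conditioning reduces entropy) and bounded below by zero, so this limit exists; and since $\frac{1}{n} H(X_1, \cdots, X_n)$ is, by the chain rule above, the Cesàro mean of these conditional entropies, it converges to the same limit. Hence $H_X = H'_X$.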

https://www.wikiwand.com/en/Entropy_rate

Entropy rate for i.i.d. process

See Asymptotic equipartition property
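
In the i.i.d. case the rate reduces to the entropy of a single letter, because independence makes the block entropy exactly additive:

$H_X = \lim_{n\rightarrow \infty} \frac{1}{n} H(X_1, \cdots, X_n) = \lim_{n\rightarrow \infty} \frac{1}{n} \sum_{k=1}^n H(X_k) = H(X_1)$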

Entropy rate for Stationary process

The asymptotic equipartition property in Chapter 3 establishes that $nH(X)$ bits suffice on the average to describe $n$ independent and identically distributed random variables. But what if the random variables are dependent? In particular, what if the random variables form a stationary process? We will show, just as in the i.i.d. case, that the entropy $H(X_1, X_2, \ldots, X_n)$ grows (asymptotically) linearly with $n$ at a rate $H(\mathcal{X})$, which we will call the entropy rate of the process. The interpretation of $H(\mathcal{X})$ as the best achievable data compression will await the analysis in Chapter 5 (from the Cover & Thomas book). See Source coding theorem
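
As a concrete instance of this linear growth: for a stationary Markov chain, the chain rule collapses because each letter depends only on its immediate predecessor,

$H(X_1, X_2, \cdots, X_n) = H(X_1) + (n-1) H(X_2 \mid X_1)$

so $\frac{1}{n} H(X_1, \cdots, X_n) \rightarrow H(X_2 \mid X_1)$, i.e. the entropy rate of a stationary Markov chain is $H_X = H(X_2 \mid X_1)$.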

Entropy rate of a finite state process
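
A minimal numerical sketch of the finite-state case, assuming the canonical example of a stationary two-state Markov chain (the transition matrix below is an arbitrary illustration, not from the source). It computes the exact block entropy $H(X_1, \cdots, X_n)$ by enumerating sequences, and shows $\frac{1}{n} H(X_1, \cdots, X_n)$ converging to the closed-form rate $H_X = -\sum_i \mu_i \sum_j P_{ij} \log_2 P_{ij} = H(X_2 \mid X_1)$, where $\mu$ is the stationary distribution:

```python
import itertools
import numpy as np

# Hypothetical two-state Markov chain (illustrative parameters, not from the source).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])  # P[i, j] = Pr(X_{k+1} = j | X_k = i)

# Stationary distribution mu solves mu P = mu (left eigenvector for eigenvalue 1).
evals, evecs = np.linalg.eig(P.T)
mu = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
mu = mu / mu.sum()

def block_entropy(n):
    """Exact H(X_1, ..., X_n) in bits, enumerating all 2^n sequences."""
    H = 0.0
    for seq in itertools.product(range(2), repeat=n):
        p = mu[seq[0]]                 # probability of the first letter
        for a, b in zip(seq, seq[1:]):
            p *= P[a, b]               # multiply in each transition
        H -= p * np.log2(p)
    return H

# Closed form: H_X = -sum_i mu_i sum_j P_ij log2 P_ij = H(X_2 | X_1).
H_rate = -sum(mu[i] * P[i, j] * np.log2(P[i, j])
              for i in range(2) for j in range(2))

for n in (1, 2, 4, 8, 12):
    print(n, block_entropy(n) / n)     # decreases toward H_rate
print("entropy rate:", H_rate)
```

By the Markov chain identity above, $\frac{1}{n} H(X_1, \cdots, X_n) = H_X + \frac{H(X_1) - H_X}{n}$, so the printed per-letter entropies converge to the rate from above at speed $O(1/n)$.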

Relations with Kolmogorov complexity

A note on Kolmogorov complexity and entropy, for stationary ergodic sources. See also Elements of Information Theory by Cover and Thomas.
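
The headline result connecting the two is Brudno's theorem: for a stationary ergodic source over a finite alphabet, the per-letter Kolmogorov complexity of almost every realization converges to the entropy rate,

$\lim_{n\rightarrow \infty} \frac{1}{n} K(X_1, \cdots, X_n) = H_X \quad \text{almost surely}$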