The entropy rate of an information source (see Data transmission) is the average entropy of a letter of the source.
An Information source is often modelled as a Discrete-time stochastic process $\{X_i\}_{i \ge 1}$, where each $X_i$ is called a "letter". The entropy rate is then defined as

$$H(\mathcal{X}) = \lim_{n\to\infty} \frac{1}{n} H(X_1, X_2, \dots, X_n),$$

when the limit exists (see also Shannon-McMillan-Breiman theorem).
Chapter 2 Information Measures - Section 2.10 Entropy Rate of a Stationary Source
One can define a related measure, $H'(\mathcal{X})$, using conditional entropies:

$$H'(\mathcal{X}) = \lim_{n\to\infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1).$$

It can be shown that, for a stationary Information source, the entropy rate exists and is equal to $H'(\mathcal{X})$.
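A minimal numerical sketch of the two definitions above, assuming a two-state stationary Markov chain with an illustrative transition matrix `P` (the numbers and the helper `block_entropy` are choices made for this example, not anything prescribed by the sources cited here): it computes the per-letter block entropy $\frac{1}{n}H(X_1,\dots,X_n)$ by direct enumeration, the conditional entropy $H(X_n \mid X_{n-1},\dots,X_1)$ via the chain rule, and compares both with the standard closed-form entropy rate of a stationary Markov chain.

```python
# Illustrative sketch: entropy rate of a 2-state stationary Markov chain,
# computed via the two definitions above and the standard closed form.
import itertools
import numpy as np

# Assumed transition matrix: P[i, j] = Pr(X_{k+1} = j | X_k = i)
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution mu solves mu P = mu, sum(mu) = 1
eigvals, eigvecs = np.linalg.eig(P.T)
mu = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
mu = mu / mu.sum()

def block_entropy(n):
    """Exact joint entropy H(X_1, ..., X_n) in bits, by enumerating all 2^n sequences."""
    h = 0.0
    for seq in itertools.product(range(2), repeat=n):
        p = mu[seq[0]]
        for a, b in zip(seq, seq[1:]):
            p *= P[a, b]
        if p > 0:
            h -= p * np.log2(p)
    return h

# Closed-form entropy rate of a stationary Markov chain:
# H = -sum_i mu_i sum_j P_ij log2 P_ij
closed_form = -sum(mu[i] * P[i, j] * np.log2(P[i, j])
                   for i in range(2) for j in range(2) if P[i, j] > 0)

for n in range(2, 13, 2):
    per_letter  = block_entropy(n) / n                      # (1/n) H(X_1, ..., X_n)
    conditional = block_entropy(n) - block_entropy(n - 1)   # H(X_n | X_1, ..., X_{n-1})
    print(f"n={n:2d}  H/n={per_letter:.4f}  H(X_n|past)={conditional:.4f}  limit={closed_form:.4f}")
```

Both columns settle at the same limit, consistent with the statement that the two definitions agree for stationary sources; the conditional form reaches it immediately here because the chain is first-order Markov, while the per-letter block entropy approaches it from above as $n$ grows.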
https://www.wikiwand.com/en/Entropy_rate
See Asymptotic equipartition property
The asymptotic equipartition property in Chapter 3 establishes that $nH(X)$ bits suffice on the average to describe $n$ independent and identically distributed random variables. But what if the random variables are dependent? In particular, what if the random variables form a stationary process? We will show, just as in the i.i.d. case, that the entropy $H(X_1, X_2, \dots, X_n)$ grows (asymptotically) linearly with $n$ at a rate $H(\mathcal{X})$, which we will call the entropy rate of the process. The interpretation of $H(\mathcal{X})$ as the best achievable data compression will await the analysis in Chapter 5 (from the Cover & Thomas book). See Source coding theorem.
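As a quick sanity check of the i.i.d. case invoked in this quote: independence makes the joint entropy additive, so

$$H(\mathcal{X}) = \lim_{n\to\infty}\frac{1}{n}H(X_1,\dots,X_n) = \lim_{n\to\infty}\frac{n\,H(X_1)}{n} = H(X_1),$$

i.e. the entropy rate of an i.i.d. source is simply the entropy of a single letter.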
Entropy rate of a finite state process
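A standard special case, stated here as context (see e.g. Cover & Thomas, Chapter 4; it is also the closed form used in the sketch above): for a stationary Markov chain with transition matrix $P$ and stationary distribution $\mu$,

$$H(\mathcal{X}) = H(X_2 \mid X_1) = -\sum_{i} \mu_i \sum_{j} P_{ij} \log P_{ij}.$$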
A note on Kolmogorov complexity and entropy, for stationary ergodic sources. See also Elements of Information Theory by Cover and Thomas.