Fano's inequality

cosmos 15th October 2017 at 7:48pm


Given two correlated random variables X and Y, Fano's inequality quantifies the probability of error when using Y to estimate X, by relating it to the conditional entropy H(X|Y). Call the probability of error P_e; then

P_e \geq \frac{H(X|Y) - 1}{\log|\chi|}

where \chi is the set over which X takes values.
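A small numerical sketch of the bound (the channel below is a hypothetical example, not from the note): X is uniform on a 4-letter alphabet, and Y equals X with probability 0.5, otherwise is uniform over the other three symbols. We compute H(X|Y), the error probability of the MAP estimator, and check that the bound holds.

```python
import math

# Hypothetical symmetric channel on a 4-letter alphabet:
# Y = X with prob 0.5, else uniform over the other three symbols.
n = 4
p_correct = 0.5
joint = {(x, y): (1 / n) * (p_correct if x == y else (1 - p_correct) / (n - 1))
         for x in range(n) for y in range(n)}

# Conditional entropy H(X|Y) in bits
p_y = {y: sum(joint[(x, y)] for x in range(n)) for y in range(n)}
h_x_given_y = -sum(p * math.log2(p / p_y[y]) for (x, y), p in joint.items())

# Error probability of the MAP estimator: guess argmax_x p(x, y)
p_err = sum(p for (x, y), p in joint.items()
            if x != max(range(n), key=lambda xx: joint[(xx, y)]))

# Fano's lower bound: P_e >= (H(X|Y) - 1) / log|X|
bound = (h_x_given_y - 1) / math.log2(n)
assert p_err >= bound  # P_e = 0.5 here, bound ≈ 0.396
```

Note that with a binary alphabet the right-hand side is never positive (since H(X|Y) ≤ 1 bit), so the bound is only informative for larger alphabets, as here.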

One can also prove, using Jensen's inequality, that for two i.i.d. random variables X and X':

P(X = X') \geq 2^{-H(X)}

with equality iff X is uniformly distributed.
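A quick check of this collision bound (the Jensen step is E[2^{\log p(X)}] \geq 2^{E[\log p(X)]} = 2^{-H(X)}, since 2^t is convex), on a hypothetical non-uniform distribution and on the uniform case where equality should hold:

```python
import math

# Hypothetical non-uniform distribution over 4 symbols
p = [0.5, 0.25, 0.125, 0.125]
entropy = -sum(pi * math.log2(pi) for pi in p)   # H(X) in bits
collision = sum(pi ** 2 for pi in p)             # P(X = X') for i.i.d. copies
assert collision >= 2 ** (-entropy)              # strict inequality here

# Equality iff X is uniform
q = [0.25] * 4
hq = -sum(qi * math.log2(qi) for qi in q)
assert abs(sum(qi ** 2 for qi in q) - 2 ** (-hq)) < 1e-12
```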