Conditional entropy

guillefix 4th November 2016 at 2:43pm

In Information theory, the conditional entropy of a Random variable Y, conditioning on another random variable X, is the entropy of Y given the value of X, averaged over the values of X:

H(Y|X) = -\sum_{x,y} p(x,y) \log p(y|x)
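As a quick numerical sanity check, here is a minimal sketch that evaluates this sum for a joint pmf given as a 2D array (assuming NumPy; the function name `conditional_entropy` and the example joint distribution are just illustrative):

```python
import numpy as np

def conditional_entropy(joint):
    """H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x), for a joint pmf p(x,y)
    given as a 2D array with rows indexed by x and columns by y."""
    joint = np.asarray(joint, dtype=float)
    p_x = joint.sum(axis=1, keepdims=True)  # marginal p(x)
    # p(y|x) = p(x,y) / p(x); guard against rows where p(x) = 0
    with np.errstate(divide="ignore", invalid="ignore"):
        p_y_given_x = np.where(p_x > 0, joint / p_x, 0.0)
    # sum only over cells with p(x,y) > 0, using the convention 0 log 0 = 0
    mask = joint > 0
    return -np.sum(joint[mask] * np.log2(p_y_given_x[mask]))

# Example: Y is a noisy copy of a fair bit X, flipped with probability 0.1
joint = [[0.45, 0.05],
         [0.05, 0.45]]
print(conditional_entropy(joint))  # ~0.469 bits, the binary entropy of 0.1
```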

Conditional entropy video

Some results:

H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)

where H(X,Y) is the Joint entropy of the pair and H(X), H(Y) are the Entropy of the individual random variables.

Proof
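A sketch of the derivation, using the factorization p(x,y) = p(x)\,p(y|x) and the definitions above:

\begin{aligned}
H(X,Y) &= -\sum_{x,y} p(x,y)\log p(x,y) \\
&= -\sum_{x,y} p(x,y)\log\big(p(x)\,p(y|x)\big) \\
&= -\sum_{x,y} p(x,y)\log p(x) \;-\; \sum_{x,y} p(x,y)\log p(y|x) \\
&= -\sum_{x} p(x)\log p(x) + H(Y|X) \\
&= H(X) + H(Y|X).
\end{aligned}

The other equality, H(X,Y) = H(Y) + H(X|Y), follows by swapping the roles of X and Y.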