Joint entropy

guillefix 4th November 2016 at 2:43pm

In Information theory, the joint entropy of a pair of Random variables X and Y is defined as:

H(X,Y) = -\sum_{x,y} p(x,y) \log p(x,y)
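
A minimal sketch in Python of this sum, using base-2 logarithms so the result is in bits (the function name `joint_entropy` and the example distribution are illustrative, not from the original):

```python
import numpy as np

def joint_entropy(p_xy):
    """Joint entropy H(X, Y) in bits of a joint distribution given as a 2-D array."""
    p = np.asarray(p_xy, dtype=float)
    p = p[p > 0]  # terms with p(x, y) = 0 contribute 0 by the convention 0 log 0 = 0
    return -np.sum(p * np.log2(p))

# Example: two independent fair coins X and Y, so each of the
# four outcomes (x, y) has probability 1/4.
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])
print(joint_entropy(p_xy))  # 2.0 bits: H(X, Y) = H(X) + H(Y) when X and Y are independent
```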
