In information theory, the joint entropy of a pair of random variables $X$ and $Y$ is defined as:

$$H(X,Y) = -\sum_{x,y} p(x,y) \log p(x,y)$$

where $p(x,y)$ is the joint probability that $X = x$ and $Y = y$; terms with $p(x,y) = 0$ contribute zero, by the convention $0 \log 0 = 0$.
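As a minimal sketch of how the definition is computed in practice (the function name `joint_entropy`, the dictionary representation of the distribution, and the choice of base-2 logarithm for units of bits are assumptions for this illustration, not from the source):

```python
import math

def joint_entropy(p_xy):
    """Joint entropy H(X, Y) in bits, given a joint distribution
    as a dict mapping (x, y) pairs to probabilities.
    Zero-probability entries are skipped (convention: 0 log 0 = 0)."""
    return -sum(p * math.log2(p) for p in p_xy.values() if p > 0)

# Example (illustrative): X is a fair coin and Y is an exact copy of X,
# so only the outcomes (0, 0) and (1, 1) have nonzero probability.
p_xy = {(0, 0): 0.5, (1, 1): 0.5}
print(joint_entropy(p_xy))  # 1.0 bit: Y adds no information beyond X
```

In this example the joint entropy equals the entropy of $X$ alone, since $Y$ is fully determined by $X$; for independent variables it would instead be the sum of the individual entropies.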