In Information theory, the mutual information between Random variables $X$ and $Y$ is defined as:

$$I(X;Y) = \mathbb{E}_{p(x,y)}\!\left[\log \frac{p(x,y)}{p(x)\,p(y)}\right],$$

where $\mathbb{E}$ denotes expectation. The mutual information measures the amount of information we obtain about $X$ by knowing $Y$ (see result below).
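As a quick sanity check, here is the definition evaluated directly for a small discrete joint distribution; the probability table is an illustrative assumption:

```python
# Evaluating I(X;Y) = E[log p(x,y) / (p(x) p(y))] for a discrete joint
# distribution given as a probability table (values are illustrative).
import numpy as np

# Joint distribution p(x, y) over two binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1, keepdims=True)  # marginal p(x), shape (2, 1)
p_y = p_xy.sum(axis=0, keepdims=True)  # marginal p(y), shape (1, 2)

# Expectation under p(x, y) of log p(x,y) / (p(x) p(y)), in nats.
mi = np.sum(p_xy * np.log(p_xy / (p_x * p_y)))
print(f"I(X;Y) = {mi:.4f} nats")  # ~0.1927
```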
The mutual information between a random variable and itself is equal to its entropy: $I(X;X) = H(X)$.
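A quick numerical check of this identity (with an arbitrary example distribution): taking $Y = X$, the joint puts $p(x)$ on the diagonal and zero elsewhere, so only terms with $p(x,x) = p(x)$ contribute.

```python
# Checking I(X;X) = H(X) numerically for an arbitrary distribution over X.
import numpy as np

p = np.array([0.2, 0.3, 0.5])
# With Y = X, p(x, x) = p(x) and off-diagonal joint probabilities vanish,
# so each term is p(x) * log(p(x) / (p(x) * p(x))) = -p(x) * log p(x).
mi_xx = np.sum(p * np.log(p / (p * p)))
entropy = -np.sum(p * np.log(p))
print(mi_xx, entropy)  # both ~1.0297 nats
```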
Some results (video):
$I(X;Y) = H(X) - H(X \mid Y)$, where $H(X \mid Y)$ is the Conditional entropy, and thus $I(X;Y)$ gives you the information about $X$ that $Y$ doesn't give you (derivation below).
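The identity follows in one line from the definition, using $p(x,y) = p(x \mid y)\,p(y)$:

$$I(X;Y) = \mathbb{E}\!\left[\log \frac{p(x \mid y)}{p(x)}\right] = -\mathbb{E}[\log p(x)] + \mathbb{E}[\log p(x \mid y)] = H(X) - H(X \mid Y).$$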
Estimating mutual information with neural nets (MINE, Belghazi et al. 2018): https://arxiv.org/abs/1801.04062
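A minimal sketch of the idea in that paper: train a small statistics network $T_\theta(x,y)$ to maximize the Donsker-Varadhan lower bound $I(X;Y) \ge \mathbb{E}_{p(x,y)}[T_\theta] - \log \mathbb{E}_{p(x)p(y)}[e^{T_\theta}]$, approximating the product of marginals by shuffling one variable within the batch. The network size, optimizer settings, and toy Gaussian data below are illustrative assumptions, not details from the paper.

```python
# A minimal MINE-style sketch (Donsker-Varadhan bound); hyperparameters and
# toy data are illustrative assumptions, not taken from the paper.
import math
import torch
import torch.nn as nn

class StatisticsNetwork(nn.Module):
    """T_theta(x, y): maps a sample pair to a scalar score."""
    def __init__(self, x_dim, y_dim, hidden=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=1)).squeeze(1)

def dv_lower_bound(T, x, y):
    """E_p(x,y)[T] - log E_p(x)p(y)[exp T], marginals via in-batch shuffling."""
    joint = T(x, y).mean()
    y_shuffled = y[torch.randperm(y.size(0))]
    marginal = torch.logsumexp(T(x, y_shuffled), dim=0) - math.log(y.size(0))
    return joint - marginal

# Toy data: correlated 1-D Gaussians with known MI = -0.5 * log(1 - rho^2),
# which is ~0.144 nats for rho = 0.5.
rho, n = 0.5, 5000
x = torch.randn(n, 1)
y = rho * x + math.sqrt(1 - rho**2) * torch.randn(n, 1)

T = StatisticsNetwork(1, 1)
opt = torch.optim.Adam(T.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    loss = -dv_lower_bound(T, x, y)  # gradient ascent on the bound
    loss.backward()
    opt.step()

print(f"Estimated I(X;Y) ~ {dv_lower_bound(T, x, y).item():.3f} nats")
```

The paper additionally uses an exponential moving average to reduce the gradient bias of the marginal term; that correction is omitted here for brevity.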