Mutual information: the difference between the entropy and the conditional entropy, i.e. the decrease in uncertainty about one random variable when you learn the value of another; equivalently, the information one random variable carries about another. A measure of dependence.
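In symbols, the standard discrete-case definition:

  I(X;Y) = H(X) - H(X|Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}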
Relative entropy (KL divergence): mutual information is a special case. Defines a measure of "distance" between probability distributions, though not a true metric (it is not symmetric and does not satisfy the triangle inequality). Applications in estimating hypothesis-testing errors and in large deviation theory.
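In symbols, for distributions p and q on the same alphabet:

  D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}

and mutual information is the relative entropy between the joint distribution and the product of the marginals:

  I(X;Y) = D\big(p(x,y) \,\|\, p(x)\,p(y)\big)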