In information theory, the conditional entropy of a random variable $Y$, conditioned on another random variable $X$, is the average entropy of $Y$ given the value of $X$, averaged over $X$:

$$H(Y \mid X) = \sum_{x} p(x)\, H(Y \mid X = x) = -\sum_{x,y} p(x,y) \log p(y \mid x).$$
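As a brief illustration (not in the original stub), consider a fair bit $Y$ with $H(Y) = 1$ bit:

$$H(Y \mid Y) = 0, \qquad H(Y \mid X) = H(Y) = 1 \text{ bit when } X \text{ is independent of } Y,$$

so conditioning on a fully informative variable removes all uncertainty, while conditioning on an irrelevant one removes none.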
Some results:

$$H(Y \mid X) = H(X,Y) - H(X),$$

$$H(Y \mid X) \le H(Y), \quad \text{with equality if and only if } X \text{ and } Y \text{ are independent,}$$

where $H(X)$ and $H(X,Y)$ denote the entropy and the joint entropy of the random variables.
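A minimal Python sketch (not part of the original article) checking the definition against the identity $H(Y \mid X) = H(X,Y) - H(X)$ on a small joint distribution; the joint table `p_xy` is an invented example:

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over two binary variables;
# rows index x, columns index y. Values are illustrative only.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Marginal of X: sum the joint table over y.
p_x = p_xy.sum(axis=1)

# Direct definition: H(Y|X) = sum_x p(x) * H(Y | X = x).
h_y_given_x = sum(p_x[i] * entropy(p_xy[i] / p_x[i]) for i in range(len(p_x)))

# Chain-rule identity: H(Y|X) = H(X,Y) - H(X).
h_chain = entropy(p_xy.flatten()) - entropy(p_x)

print(h_y_given_x, h_chain)  # both come out to about 0.861 bits
```

Both computations agree, as the identity requires: averaging the per-value entropies $H(Y \mid X = x)$ gives the same number as subtracting $H(X)$ from the joint entropy.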