Conditional Entropy

Overview


The conditional entropy is a measure of the remaining uncertainty in one random variable given knowledge of another. It appears in the chain rule, which states that the joint entropy of two variables equals the entropy of the first plus the conditional entropy of the second given the first.
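In symbols, the chain rule described above can be written as:

{% H(X,Y) = H(X) + H(Y|X) %}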

Definition


The conditional entropy is defined by the following formula:
{% H(Y|X) = \sum_x p(x) \, H(Y|X=x) %}
{% = -\sum_{x,y} p(x,y) \log p(y|x) %}
{% = -\mathbb{E}[\log p(Y|X)] %}
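A minimal sketch of the definition, computing H(Y|X) directly from a joint distribution. The distribution `p_xy` is a made-up example chosen for illustration (any joint distribution summing to 1 works), and entropy is measured in bits (base-2 logarithm):

```python
import math

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
# Values are illustrative only; they must sum to 1.
p_xy = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.40, (1, 1): 0.10,
}

def conditional_entropy(p_xy):
    """H(Y|X) = -sum over (x, y) of p(x, y) * log2 p(y|x), in bits."""
    # Marginal distribution p(x), obtained by summing the joint over y.
    p_x = {}
    for (x, _), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
    h = 0.0
    for (x, y), p in p_xy.items():
        if p > 0:
            # p(y|x) = p(x, y) / p(x)
            h -= p * math.log2(p / p_x[x])
    return h

print(conditional_entropy(p_xy))
```

As a sanity check, the chain rule holds numerically: the joint entropy of this distribution equals the marginal entropy of X plus the conditional entropy computed above.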