Information Theory

Overview


Conditional Entropy

The conditional entropy H(Y|X) measures the average uncertainty remaining in Y once X is known:

{% H(Y|X) = \sum_x p(x) \, H(Y|X=x) %}
{% = -\sum_{x,y} p(x,y) \log p(y|x) %}
{% = -\mathbb{E}[\log p(Y|X)] %}
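
As a quick check of the second form, here is a minimal sketch that computes H(Y|X) directly from a joint distribution. The array `p_xy` is a hypothetical example (not from the text above), and the use of log base 2 (bits) is an assumption.

```python
import numpy as np

# Hypothetical joint distribution p(x, y): rows index x, columns index y.
# Entries sum to 1.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

# H(Y|X) = -sum_{x,y} p(x,y) * log p(y|x)
p_x = p_xy.sum(axis=1, keepdims=True)   # marginal p(x)
p_y_given_x = p_xy / p_x                # conditional p(y|x)
mask = p_xy > 0                         # skip zero-probability terms
h_y_given_x = -np.sum(p_xy[mask] * np.log2(p_y_given_x[mask]))

print(f"H(Y|X) = {h_y_given_x:.4f} bits")
```

For this example the result is about 0.861 bits: knowing X leaves, on average, a little less than one bit of uncertainty about Y.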

Contents