Joint Entropy

Overview


The joint entropy extends the concept of entropy to multiple random variables. For a pair of discrete random variables X and Y with joint distribution p(x, y), it is defined by the following formula:
{% H(X,Y) = - \sum_{x,y} p(x,y) \log p(x,y) %}
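The definition above can be computed directly by summing over the joint distribution. A minimal sketch with NumPy (the function name `joint_entropy` and the example distribution are illustrative, not from the source):

```python
import numpy as np

def joint_entropy(p_xy):
    """Joint entropy in bits of a discrete joint distribution p(x, y)."""
    p = np.asarray(p_xy, dtype=float).ravel()
    p = p[p > 0]  # drop zero entries: 0 * log(0) is taken as 0 by convention
    return -np.sum(p * np.log2(p))

# Uniform distribution over 4 (x, y) outcomes
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])
print(joint_entropy(p_xy))  # 4 equally likely outcomes -> 2.0 bits
```

Using log base 2 gives entropy in bits; swapping in `np.log` would give nats.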

Chain Rule


The chain rule decomposes the joint entropy: the uncertainty of the pair (X, Y) equals the uncertainty of X plus the remaining uncertainty of Y once X is known.

{% H(X,Y) = H(X) + H(Y|X) %}
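The chain rule can be verified numerically on a small joint distribution. In this sketch (helper names and the example distribution are assumptions for illustration), H(Y|X) is computed as the average entropy of the conditional distributions p(y | x):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Example joint distribution p(x, y); rows index x, columns index y
p_xy = np.array([[0.50, 0.25],
                 [0.00, 0.25]])

p_x = p_xy.sum(axis=1)          # marginal p(x)
H_xy = entropy(p_xy)            # H(X, Y)
H_x = entropy(p_x)              # H(X)
# H(Y|X) = sum_x p(x) * H(Y | X = x)
H_y_given_x = sum(px * entropy(row / px)
                  for px, row in zip(p_x, p_xy) if px > 0)

print(np.isclose(H_xy, H_x + H_y_given_x))  # True: H(X,Y) = H(X) + H(Y|X)
```

Here H(X, Y) = 1.5 bits, H(X) ≈ 0.811 bits, and H(Y|X) ≈ 0.689 bits, so the two sides agree.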