Mutual Information

Overview


The mutual information, denoted by {% I(X,Y) %}, is a measure of how much information one random variable carries about another. From an information-theoretic perspective, the information that {% X %} has about {% Y %} is the reduction in the entropy of {% Y %} obtained by knowing the value of {% X %}.

Definition


{% I(X,Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} %}
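The definition can be computed directly from a joint distribution table. A minimal sketch in Python, using a hypothetical joint distribution over two binary variables (the distribution and the helper name `mutual_information` are illustrative, not from the source):

```python
import math

# Hypothetical joint distribution p(x, y) for two binary variables.
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

def mutual_information(joint):
    """I(X,Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) )."""
    # Marginals p(x) and p(y), obtained by summing out the other variable.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # Sum only over outcomes with positive probability (0 log 0 = 0).
    return sum(
        p * math.log2(p / (px[x] * py[y]))
        for (x, y), p in joint.items()
        if p > 0
    )

print(mutual_information(joint))
```

Using base-2 logarithms gives the result in bits; any other base only rescales the value.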

Relationships


  • {% I(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) %}
  • {% I(X,Y) = H(X) + H(Y) - H(X,Y) %}
  • {% I(X,X) = H(X) %}
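These identities can be checked numerically. A sketch that verifies {% I(X,Y) = H(X) + H(Y) - H(X,Y) %} and {% I(X,X) = H(X) %} on the same hypothetical joint distribution as above (all names here are illustrative):

```python
import math

def H(dist):
    """Shannon entropy in bits of a distribution {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical joint distribution over (X, Y).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# Mutual information from the definition.
mi = sum(p * math.log2(p / (px[x] * py[y]))
         for (x, y), p in joint.items() if p > 0)

# I(X,Y) = H(X) + H(Y) - H(X,Y)
assert abs(mi - (H(px) + H(py) - H(joint))) < 1e-9

# I(X,X) = H(X): the joint of X with itself puts all mass on the diagonal.
self_joint = {(x, x): p for x, p in px.items()}
mi_xx = sum(p * math.log2(p / (px[x] * px[y]))
            for (x, y), p in self_joint.items() if p > 0)
assert abs(mi_xx - H(px)) < 1e-9

print("identities verified")
```

The last identity is why entropy is sometimes called self-information: a variable removes all of its own uncertainty.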