Information Theory

Overview

The theory of information takes several different forms. This corner focuses on the one initially developed by Claude Shannon.

Topics

  • Entropy - the central notion in information theory; a measure of uncertainty or information (see the sketch after this list)
    • Joint Entropy - the amount of entropy contained in two random variables jointly
    • Conditional Entropy - the additional amount of uncertainty that one variable contributes to the joint entropy
    • Entropy Rate - the entropy of a stochastic process
  • Mutual Information - the reduction in uncertainty about one variable gained by observing another
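
All of the quantities above can be computed directly from a discrete joint distribution. Here is a minimal sketch (the joint table and the Markov transition matrix are made-up illustrative numbers, not from this page): it computes H(X), H(Y), the joint entropy H(X,Y), the conditional entropy via the chain rule H(Y|X) = H(X,Y) - H(X), the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y), and the entropy rate of a two-state Markov chain.

    import numpy as np

    # Joint distribution p(x, y) of two binary variables; rows index X, columns Y.
    # (Made-up illustrative numbers.)
    p_xy = np.array([[0.30, 0.20],
                     [0.10, 0.40]])

    def H(p):
        """Shannon entropy in bits: H = -sum_i p_i log2 p_i, with 0 log 0 := 0."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    p_x = p_xy.sum(axis=1)   # marginal distribution of X
    p_y = p_xy.sum(axis=0)   # marginal distribution of Y

    H_x  = H(p_x)            # entropy H(X)
    H_y  = H(p_y)            # entropy H(Y)
    H_xy = H(p_xy.ravel())   # joint entropy H(X, Y)

    H_y_given_x = H_xy - H_x   # chain rule: H(Y|X) = H(X,Y) - H(X)
    I_xy = H_x + H_y - H_xy    # mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y)

    print(f"H(X) = {H_x:.4f}  H(Y) = {H_y:.4f}  H(X,Y) = {H_xy:.4f}")
    print(f"H(Y|X) = {H_y_given_x:.4f}  I(X;Y) = {I_xy:.4f}")

    # Entropy rate of a stationary Markov chain with transition matrix P:
    # rate = -sum_i pi_i sum_j P[i,j] log2 P[i,j], pi = stationary distribution.
    P = np.array([[0.9, 0.1],
                  [0.5, 0.5]])
    eigvals, eigvecs = np.linalg.eig(P.T)   # stationary pi is a left eigenvector of P
    pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    pi /= pi.sum()                          # normalize (fixes the eigenvector's sign too)
    rate = -sum(pi[i] * q * np.log2(q) for i in range(len(P)) for q in P[i] if q > 0)
    print(f"entropy rate = {rate:.4f} bits/step")

Note that H(Y|X) and I(X;Y) fall out of the chain rule rather than being computed from scratch, and the entropy rate weights each row's transition entropy by the stationary probability of occupying that row's state.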

Information Theory, Statistics and Artificial Intelligence

  • Entropy Optimization - a principle with broad applicability in statistics and machine learning (see the sketch below)
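
One well-known instance is the maximum-entropy principle: among all distributions consistent with given constraints, choose the one with the greatest entropy. As a hedged sketch (the dice example, the function name max_entropy_die, and the bisection solver are illustrative choices, not taken from this page), here is Jaynes' dice problem: the maximum-entropy distribution on the faces 1..6 with a prescribed mean has the Gibbs form p_k proportional to exp(lam * k), and the multiplier lam can be found by bisection because the induced mean is monotone increasing in lam.

    import numpy as np

    def max_entropy_die(target_mean, lo=-20.0, hi=20.0, tol=1e-10):
        """Maximum-entropy distribution on die faces 1..6 with the given mean.

        The maximizer has the Gibbs form p_k proportional to exp(lam * k);
        solve for the Lagrange multiplier lam by bisection on the induced
        mean, which is monotone increasing in lam.
        """
        k = np.arange(1, 7)

        def dist(lam):
            w = np.exp(lam * k)
            return w / w.sum()

        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if dist(mid) @ k < target_mean:
                lo = mid
            else:
                hi = mid
        return dist(0.5 * (lo + hi))

    p = max_entropy_die(4.5)   # Jaynes' dice problem: observed mean 4.5, not 3.5
    print(np.round(p, 4))      # probabilities tilt toward the high faces
    print("entropy:", round(float(-np.sum(p * np.log2(p))), 4), "bits")

With no constraint beyond normalization the same principle recovers the uniform distribution (lam = 0), which has the maximum possible entropy log2(6) bits.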