Overview
The covariance is an extension of the concept of variance to two variables. It is defined as
{% Cov(X,Y) = \mathbb{E}[(X - \mu_X)(Y - \mu_Y)] %}
The covariance is a simple measure of the dependence of two random variables. It is related to the
concept of
correlation
and is meant to measure the degree to which two random variables move together. It can also be thought of as
a simple measure of the information that one variable contains about another variable.
The covariance (and correlation) works best when the variables are Gaussian (normal). However, it is possible for two random variables to have an exact functional relationship and yet still have zero covariance. For example, if X is symmetric about zero and Y = X^2, then Y is completely determined by X but Cov(X,Y) = 0.
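A minimal sketch of this pitfall: a small helper computes the population covariance directly from the definition above, and applying it to a symmetric X and Y = X^2 yields exactly zero even though Y is a deterministic function of X. (The helper name `cov` and the sample values are illustrative choices, not from the original text.)

```python
def cov(xs, ys):
    # population covariance: E[(X - mu_X)(Y - mu_Y)]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

xs = [-2, -1, 0, 1, 2]    # X takes values symmetric about 0
ys = [x * x for x in xs]  # Y = X^2 is an exact function of X

print(cov(xs, ys))  # 0.0 -- zero covariance despite full dependence
```

The symmetry of X makes every term x*(x^2 - mean) cancel with its mirror image, which is why the covariance vanishes.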
Linearity
Covariance is linear in each of its arguments (it is bilinear). For fixed X,
{% Cov(X, aY + bZ) = aCov(X,Y) + bCov(X,Z) %}
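The linearity identity can be checked numerically: sample covariance satisfies the same algebra, so the two sides agree up to floating-point rounding. This is a sketch with made-up Gaussian samples and arbitrary coefficients a and b.

```python
import random

def cov(xs, ys):
    # population covariance of paired samples
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

random.seed(0)
X = [random.gauss(0, 1) for _ in range(1000)]
Y = [random.gauss(0, 1) for _ in range(1000)]
Z = [random.gauss(0, 1) for _ in range(1000)]
a, b = 2.0, -3.0

lhs = cov(X, [a * y + b * z for y, z in zip(Y, Z)])  # Cov(X, aY + bZ)
rhs = a * cov(X, Y) + b * cov(X, Z)                  # aCov(X,Y) + bCov(X,Z)
print(abs(lhs - rhs))  # ~0, up to floating-point error
```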
Given a weighted sum of a set of random variables
{% w_1 X_1 + w_2 X_2 + ... + w_n X_n %}
and given the matrix {% \Sigma %} of covariances of the random variables
{% \Sigma_{i,j} = Cov(X_i, X_j) %}
then we have
{% Var(w_1 X_1 + w_2 X_2 + ... + w_n X_n) = \vec{w}^T \Sigma \vec{w} %}
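The quadratic form above can be verified directly: build the covariance matrix entry by entry, compute w^T Σ w, and compare it with the variance of the weighted sum computed from the combined samples. The weights and sample data here are illustrative assumptions.

```python
import random

def cov(xs, ys):
    # population covariance of paired samples
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

random.seed(1)
n, m = 3, 500
# n random variables, each observed m times
samples = [[random.gauss(0, 1) for _ in range(m)] for _ in range(n)]
w = [0.5, 0.3, 0.2]

# covariance matrix: Sigma[i][j] = Cov(X_i, X_j)
Sigma = [[cov(samples[i], samples[j]) for j in range(n)] for i in range(n)]

# quadratic form w^T Sigma w
var_quadratic = sum(w[i] * Sigma[i][j] * w[j]
                    for i in range(n) for j in range(n))

# direct variance of the weighted sum, for comparison
combined = [sum(w[i] * samples[i][k] for i in range(n)) for k in range(m)]
var_direct = cov(combined, combined)

print(var_quadratic, var_direct)  # agree up to floating-point error
```

This identity is what makes portfolio variance computable from pairwise covariances alone, without ever forming the combined variable.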