Overview
Given two random variables {% X %} and {% Y %}, their sum is a new random variable {% Z %} defined pointwise: for each element {% \omega %} of the sample space,
{% Z(\omega) = X(\omega) + Y(\omega) %}
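As a concrete sketch of the pointwise definition, here is a small Python example taking the roll of two fair dice as an illustrative sample space (the names are hypothetical, not from the text above):

```python
# Sample space: all ordered outcomes of rolling two fair dice
# (an illustrative choice; any common sample space works the same way)
omega_space = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def X(omega):
    return omega[0]  # value of the first die

def Y(omega):
    return omega[1]  # value of the second die

def Z(omega):
    # The sum is defined pointwise: Z(omega) = X(omega) + Y(omega)
    return X(omega) + Y(omega)

print(Z((3, 5)))  # 8
```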
Discrete Probability
When {% X %} and {% Y %} are discrete random variables taking integer values, the probability mass function of their sum can be computed as
{% p_z(z) = \sum_{x = - \infty} ^{\infty} p(x, z-x) %}
where {% p(x,y) %} is the joint probability of {% X %} and {% Y %}.
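A minimal sketch of this sum in Python, assuming {% X %} and {% Y %} are independent rolls of a fair die so that the joint probability factors as {% p(x, y) = p_x(x) p_y(y) %} (an assumption made here for illustration, not required by the formula):

```python
from fractions import Fraction

# PMF of a fair six-sided die (illustrative distribution)
die = {k: Fraction(1, 6) for k in range(1, 7)}

def pmf_of_sum(p_x, p_y):
    """p_z(z) = sum over x of p(x, z - x), where the joint probability
    p(x, y) = p_x(x) * p_y(y) under the independence assumption."""
    p_z = {}
    for x, px in p_x.items():
        for y, py in p_y.items():
            p_z[x + y] = p_z.get(x + y, Fraction(0)) + px * py
    return p_z

two_dice = pmf_of_sum(die, die)
print(two_dice[7])             # 1/6, from the six ordered pairs that sum to 7
print(sum(two_dice.values()))  # 1, as required of a PMF
```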
Continuous Probability
When {% X %} and {% Y %} are continuous, with joint probability density function {% f(x,y) %}, the cumulative distribution function of {% Z %} can be computed as
{% F_z(z) = \int_{- \infty} ^{\infty} \int_{- \infty} ^{z} f(x, v-x) dv dx %}
Interchanging the order of integration gives
{% F_z(z) = \int_{- \infty} ^{z} \int_{- \infty} ^{\infty} f(x, v-x) dx dv %}
The density function for {% Z %} can then be computed by differentiating with respect to {% z %}
{% f_z(z) = \int_{- \infty} ^{\infty} f(x, z-x) dx %}
The integral given is known as the convolution. In particular, when {% X %} and {% Y %} are independent, so that {% f(x,y) = f_x(x) f_y(y) %}, it reduces to the convolution of the two marginal densities, {% f_z(z) = \int_{- \infty} ^{\infty} f_x(x) f_y(z-x) dx %}.
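Numerically, the convolution integral can be approximated by a Riemann sum on a grid. A sketch using NumPy, assuming {% X %} and {% Y %} are independent standard normals (so {% Z %} should be normal with mean 0 and variance 2, an assumption chosen here so the result can be checked against a known density):

```python
import numpy as np

dx = 0.01
x = np.arange(-10.0, 10.0, dx)
f_x = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard normal density
f_y = f_x.copy()

# f_z(z) = integral of f_x(x) f_y(z - x) dx  ~  (discrete convolution) * dx
f_z = np.convolve(f_x, f_y) * dx
z = 2 * x[0] + dx * np.arange(len(f_z))       # grid on which f_z lives

# Compare with the exact N(0, 2) density at z = 0
exact = 1 / np.sqrt(2 * np.pi * 2)
approx = f_z[np.argmin(np.abs(z))]
print(approx, exact)
```

The `np.convolve(...) * dx` scaling turns the discrete sum into an approximation of the integral; shrinking `dx` tightens the agreement.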