Overview
For a given time series
{% Y_1, Y_2, \ldots, Y_n %}
modeled as a linear regression on covariates {% X_t %},
{% Y_t = \alpha + \beta^T X_t + \epsilon_t %}
the errors are the residuals of the regression, assumed conditionally normal with time-varying variance:
{% \epsilon_t = Y_t - \alpha - \beta^T X_t %}
{% \epsilon_t \sim N(0,\sigma_t^2) %}
Then, a GARCH(p, q) model specifies the conditional variance as
{% \sigma_t^2 = \omega + \sum_{i=1}^q \alpha_i\epsilon_{t-i}^2 + \sum_{j=1}^p \beta_j \sigma_{t-j}^2 %}
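The recursion above can be made concrete with a small simulation. The sketch below, a minimal GARCH(1,1) simulator in NumPy (function name and parameter values are illustrative, not from the source), initializes the variance at its unconditional level {% \omega / (1 - \alpha_1 - \beta_1) %} and then applies the variance equation step by step:

```python
import numpy as np

def simulate_garch11(n, omega=0.1, alpha=0.2, beta=0.7, seed=0):
    """Simulate n draws from a GARCH(1,1) process.

    Variance recursion: sigma2[t] = omega + alpha*eps[t-1]**2 + beta*sigma2[t-1]
    Requires alpha + beta < 1 for the process to be covariance-stationary.
    """
    rng = np.random.default_rng(seed)
    eps = np.empty(n)
    sigma2 = np.empty(n)
    # Start at the unconditional variance omega / (1 - alpha - beta).
    sigma2[0] = omega / (1.0 - alpha - beta)
    eps[0] = rng.normal(0.0, np.sqrt(sigma2[0]))
    for t in range(1, n):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
        eps[t] = rng.normal(0.0, np.sqrt(sigma2[t]))
    return eps, sigma2

eps, sigma2 = simulate_garch11(5000)
```

Since `omega > 0` and `alpha, beta >= 0`, every simulated conditional variance stays strictly positive, and large shocks `eps[t-1]` feed into elevated variance at time `t` (volatility clustering).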