Markov Chains
Overview
A Markov chain is a model in which the state in one period depends only on the state in the immediately prior period, not on the entire history of the model.
States of the model are usually indexed by an integer:
{% x_0, x_1, \dots, x_n %}
The Markov condition can be stated as
{% P(x_n \mid x_{n-1}, x_{n-2}, \dots) = P(x_n \mid x_{n-1}) %}
The probability of a transition from state i to state j is denoted {% P_{ij} %}. Since the chain must move to some state, the transition probabilities out of each state sum to 1:
{% \sum_j P_{ij} = 1 %}
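To make this concrete, here is a minimal Python sketch (the three-state chain and its probabilities are hypothetical, not taken from the text above) that stores {% P_{ij} %} as a nested list, checks the normalization condition, and samples the next state using only the current one:

```python
import random

# Hypothetical 3-state chain: 0 = sunny, 1 = cloudy, 2 = rainy.
# P[i][j] is the probability of moving from state i to state j.
P = [
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
]

# Each row must sum to 1: sum_j P_ij = 1.
for i, row in enumerate(P):
    assert abs(sum(row) - 1.0) < 1e-9, f"row {i} is not normalized"

# The Markov condition in action: the next state is sampled using
# only the current state, never the earlier history.
def step(current_state):
    return random.choices(range(len(P)), weights=P[current_state])[0]

state = 0
for _ in range(5):
    state = step(state)
    print(state)
```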
Transition Matrices
Transition probabilities are usually arranged in a matrix, called the transition matrix, such as
{%
\begin{bmatrix}
P_{0,0} & P_{0,1} & P_{0,2} \\
P_{1,0} & P_{1,1} & P_{1,2}\\
P_{2,0} & P_{2,1} & P_{2,2} \\
\end{bmatrix}
%}
The sum of the entries across each row is then equal to 1.
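As a rough illustration of how the transition matrix is used (a sketch with the same hypothetical probabilities as before), the code below verifies that each row sums to 1 and raises the matrix to a power. The (i, j) entry of the n-th power of the matrix gives the probability of being in state j after n steps when starting from state i:

```python
import numpy as np

# The 3x3 transition matrix from above (values are hypothetical).
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
])

# Row sums are all 1, confirming P is a valid row-stochastic matrix.
print(P.sum(axis=1))  # [1. 1. 1.]

# Entry (i, j) of P^n is the probability of being in state j
# after n steps, starting from state i.
P_10 = np.linalg.matrix_power(P, 10)
print(P_10[0])  # distribution over states after 10 steps from state 0
```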
Topics