Method of Moments

Overview

The method of moments is a way to fit a distribution to a sample dataset drawn from that distribution. In effect, the analyst computes a set of moments (such as the mean and variance) from the sample dataset, and then sets the parameters of the distribution so that the resulting distribution has the same moments as the computed sample values.
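More generally, for a distribution with {% k %} unknown parameters, the method equates the first {% k %} theoretical moments to the corresponding sample moments {% \hat{m}_j = \frac{1}{n} \sum_{i=1}^{n} x_i^j %} for {% j = 1, \dots, k %}, and solves the resulting system of equations for the parameters.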

Normal Example

When fitting a normal distribution to a set of sampled data points, the analyst computes the sample mean {% \bar{x} %} and the sample variance {% s^2 %} of the points, and then chooses {% \mu = \bar{x} %} and {% \sigma^2 = s^2 %}.
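As a concrete sketch, assuming NumPy is available (the data below are synthetic and the variable names are illustrative), the fit amounts to:

```python
import numpy as np

# Synthetic sample standing in for observed data points.
rng = np.random.default_rng(42)
data = rng.normal(loc=5.0, scale=2.0, size=1_000)

# Method-of-moments fit for the normal distribution:
# match the first sample moment to mu and the second
# central sample moment to sigma^2.
mu_hat = np.mean(data)
sigma2_hat = np.var(data)  # ddof=0: divide by n, the moment estimator

print(f"mu = {mu_hat:.3f}, sigma^2 = {sigma2_hat:.3f}")
```

Note that `np.var` divides by {% n %} by default, which is exactly the moment estimator; the unbiased sample variance would divide by {% n - 1 %} instead.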
