Weighted Least Squares Regression

Overview


In an ordinary least squares regression, the fit minimizes the mean squared error, given as
{% MSE = \frac{1}{n} \sum (Y_i - X_i \beta )^2 %}
In a weighted least squares regression, each point is given a weight {% w_i %}, and the objective becomes
{% MSE = \frac{1}{n} \sum w_i (Y_i - X_i \beta )^2 %}
This can be rewritten as
{% MSE = \frac{1}{n} \sum (\sqrt{w_i}Y_i - \sqrt{w_i}X_i \beta )^2 %}
This form shows how to compute the fit: multiply each point's x values and y value by the square root of its weight, then run an ordinary least squares regression on the transformed data.
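As a quick check of this rescaling trick, the following sketch (with made-up data) fits a no-intercept model {% y = \beta x %} both directly from the weighted normal equations and by rescaling followed by ordinary least squares. The no-intercept form keeps the example short; with an intercept, the column of ones must be rescaled the same way.

```javascript
// Ordinary least squares slope for y = b*x (no intercept).
function olsSlope(xs, ys) {
  let sxy = 0, sxx = 0;
  for (let i = 0; i < xs.length; i++) {
    sxy += xs[i] * ys[i];
    sxx += xs[i] * xs[i];
  }
  return sxy / sxx;
}

// Direct weighted least squares slope: minimizes sum of w_i (y_i - b*x_i)^2.
function wlsSlope(xs, ys, ws) {
  let swxy = 0, swxx = 0;
  for (let i = 0; i < xs.length; i++) {
    swxy += ws[i] * xs[i] * ys[i];
    swxx += ws[i] * xs[i] * xs[i];
  }
  return swxy / swxx;
}

// Made-up example data and weights.
const xs = [1, 2, 3, 4];
const ys = [2.1, 3.9, 6.2, 7.8];
const ws = [1, 4, 1, 0.25];

// Rescale each point by the square root of its weight, then run plain OLS.
const sx = xs.map((x, i) => Math.sqrt(ws[i]) * x);
const sy = ys.map((y, i) => Math.sqrt(ws[i]) * y);

const direct = wlsSlope(xs, ys, ws);
const viaOls = olsSlope(sx, sy);
// The two estimates agree up to floating point error.
```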

Algorithm


The implementation of weighted least squares is a straightforward use of the map function (see filtering and transformations).



// w[i] is the weight for point i; scale each coordinate by sqrt(w[i]).
let data2 = data.map((p, i) => {
  const s = Math.sqrt(w[i]);
  return {
    x1: s * p.x1,
    x2: s * p.x2,
    y: s * p.y
  };
});
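A runnable version of this transformation, with hypothetical `data` and `w` values chosen here for illustration:

```javascript
// Hypothetical inputs: an array of points and a parallel array of weights.
const data = [
  { x1: 1, x2: 2, y: 3 },
  { x1: 4, x2: 5, y: 6 }
];
const w = [4, 0.25];

// Rescale each coordinate by the square root of that point's weight.
const data2 = data.map((p, i) => {
  const s = Math.sqrt(w[i]);
  return { x1: s * p.x1, x2: s * p.x2, y: s * p.y };
});
// First point is scaled by sqrt(4) = 2, second by sqrt(0.25) = 0.5.
```

The result `data2` can then be passed to any ordinary least squares routine.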

Locally Weighted Least Squares


A locally weighted least squares regression is a method for building a forecast function from a set of data points, where every forecast is produced by a separate weighted regression. The weighted regression runs over the points in the data set, where each point receives a weight that depends on its distance from the function's input point, typically decreasing with distance so that nearby points influence the fit more.

That is, we start with a distance function (or metric) that specifies the distance between points. Then, to calculate f(x), run a weighted regression over the points {% x_i %} in the data set, where each assigned weight {% w_i %} is a decreasing function of the distance {% d(x, x_i) %}, for example {% w_i = e^{-d(x, x_i)^2} %}.
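A sketch of the idea for one-dimensional data, fitting a local line {% y = a + bx %} at each query point. The Gaussian kernel and the `bandwidth` parameter are our choices for illustration; the document leaves the weight function open.

```javascript
// Locally weighted (local linear) forecast: fit y = a + b*x with weights
// centered at the query point x, then evaluate the local fit at x.
function localLinearForecast(x, points, bandwidth) {
  // Weight each data point by a decreasing function of its distance to x.
  const ws = points.map((p) => {
    const d = x - p.x;
    return Math.exp(-(d * d) / (2 * bandwidth * bandwidth));
  });

  // Weighted normal equations for y = a + b*x.
  let Sw = 0, Swx = 0, Swy = 0, Swxx = 0, Swxy = 0;
  points.forEach((p, i) => {
    const wi = ws[i];
    Sw += wi;
    Swx += wi * p.x;
    Swy += wi * p.y;
    Swxx += wi * p.x * p.x;
    Swxy += wi * p.x * p.y;
  });
  const det = Sw * Swxx - Swx * Swx;
  const b = (Sw * Swxy - Swx * Swy) / det;
  const a = (Swy - b * Swx) / Sw;

  // Evaluate the locally fitted line at the query point.
  return a + b * x;
}

// On exactly linear data, the local fit reproduces the line at any point.
const pts = [0, 1, 2, 3, 4].map((x) => ({ x, y: 2 * x + 1 }));
const yHat = localLinearForecast(2.5, pts, 1.0);
```

Note that because a fresh regression is solved per query point, evaluating the forecast at many points costs one weighted fit each.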
