JavaScript Implementation of Backpropagation

Overview


This page walks through a small worked example of backpropagation in JavaScript: defining a target function, building a one-layer network, generating a test dataset, and training the network with stochastic gradient descent.

Function


The test will try to approximate the function
{% f(\vec{x}) = 0.5\times x_1^2 + x_2 %}
The network we create is a simple one-layer network: an affine layer followed by the ReLU activation function.



let layers = [];
// Affine layer: 1x2 weight matrix [[1,1]] and bias [[1]].
layers.push(ba.affine([[1,1]], [[1]]));
// ReLU activation.
layers.push(ba.relu());
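

Concretely, this network computes {% \mathrm{ReLU}(W\vec{x} + b) %}, starting from the weights {% W = [1, 1] %} and bias {% b = 1 %} given above. As a rough sketch of what the forward pass does, in plain JavaScript (illustrative only, not the ba internals; the name forward is hypothetical):


// Illustrative forward pass: an affine map W·x + b followed by ReLU.
function forward(W, b, x){
    let z = W[0][0]*x[0][0] + W[0][1]*x[1][0] + b[0][0];
    return [[Math.max(0, z)]];
}

forward([[1,1]], [[1]], [[1],[4]]); // [[6]] with the initial parameters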

Test Dataset


A test dataset is created, consisting of a series of inputs. The corresponding outputs are obtained by mapping each input through the target function.



// Each training input is a 2x1 column vector.
let inputs = [];
inputs.push([[1],[4]]);
inputs.push([[2],[2]]);
inputs.push([[1],[3]]);
inputs.push([[3],[4]]);
inputs.push([[4],[1]]);
inputs.push([[1.5],[1.5]]);
inputs.push([[2.5],[4]]);

          
// Target function f(x) = 0.5*x1^2 + x2, applied to a 2x1 column vector.
let f = function(vector){
    let val = 0.5*vector[0][0]*vector[0][0] + vector[1][0];
    return [[val]];
};

let outputs = inputs.map(p=>f(p));
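
As a sanity check on this mapping, the first input, {% \vec{x} = (1, 4) %}, evaluates to {% 0.5\times 1^2 + 4 = 4.5 %}:


console.log(f([[1],[4]])); // prints [[4.5]]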

Training


To train the network, we use a variant of stochastic gradient descent with a constant step size: the training examples are visited in order, and the parameters are updated after each one.


for(let i=0;i<100;i++){
    for(let j=0;j<inputs.length;j++){
        // One gradient step per training example.
        ba.backpropagate(layers, inputs[j], outputs[j], ba.meanSquaredError());
    }
    // Track the mean squared error over the whole dataset once per epoch.
    let testError = ba.error(layers, inputs, outputs, ba.meanSquaredError());
    console.log(testError);
}
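
For this particular one-layer architecture, the update that each backpropagation step performs can also be written out by hand. The sketch below is conceptual only (sgdStep, params, and stepSize are hypothetical names, not part of the ba API): the squared error is differentiated through the ReLU and the affine layer, and the parameters are moved against the gradient by a constant step size.


// Conceptual sketch of one SGD step for y = relu(W·x + b)
// with squared-error loss and a constant step size.
function sgdStep(params, x, target, stepSize){
    let z = params.W[0]*x[0][0] + params.W[1]*x[1][0] + params.b; // affine pre-activation
    let y = Math.max(0, z);                                       // ReLU output
    let dLdz = (z > 0 ? 1 : 0) * 2 * (y - target);                // chain rule: loss derivative, then ReLU gate
    params.W[0] -= stepSize * dLdz * x[0][0];                     // dL/dw_i = dL/dz * x_i
    params.W[1] -= stepSize * dLdz * x[1][0];
    params.b    -= stepSize * dLdz;                               // dL/db = dL/dz
}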