Overview
Forward Pass
The forward pass takes an input and evaluates the output of the neural network. The following function iterates through each layer of the network, feeding each layer's output in as the input of the next. As it goes, it collects the intermediate results on a stack, along with each layer's gradients.
export function evaluateStack(layers, input){
    let result = input;
    let stack = [];
    for(let i = 0; i < layers.length; i++){
        let layer = layers[i];
        // Record this layer's input and its gradient with respect to that
        // input, for use in the backward pass.
        let item = {
            input: result,
            inputGradient: layer.inputGradient(result),
            layer: layer
        };
        // Only layers with parameters supply a parameter gradient.
        if(layer.parameterGradient !== undefined) item.parameterGradient = layer.parameterGradient(result);
        stack.push(item);
        // The output of this layer becomes the input of the next.
        result = layer.evaluate(result);
    }
    // The final entry holds the output of the whole network.
    stack.push({
        input: result
    });
    return stack;
}
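As a minimal sketch of how evaluateStack is called (the doubling layer below is a made-up example for illustration, not part of the library), a layer only needs evaluate and inputGradient methods, plus parameterGradient and update if it has parameters:

// A hypothetical layer that doubles a 2-vector. Its Jacobian in
// denominator layout is the 2x2 matrix 2I; it has no parameters.
let double = {
    evaluate: x => x.map(v => 2 * v),
    inputGradient: x => [[2, 0], [0, 2]]
};

let stack = evaluateStack([double, double], [1, 2]);
// stack[0].input is [1, 2], stack[1].input is [2, 4],
// and stack[2].input is the network output [4, 8].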
Backward Pass
{% \frac{d f(g(h(x)))}{dx} = \frac{dh}{dx} \frac{dg}{dh} \frac{df}{dg} %}
Notice the ordering of the factors in the chain rule. Because we are using denominator layout, the factors appear in the reverse of the usual order: the innermost derivative comes first, and each gradient from the layer above multiplies on the right.
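A quick shape check makes the ordering concrete (the dimensions here are only for illustration). If x has n components, h(x) has k, g(h) has m, and f is a scalar error, then in denominator layout the factors have shapes

{% \underbrace{\frac{dh}{dx}}_{n \times k} \, \underbrace{\frac{dg}{dh}}_{k \times m} \, \underbrace{\frac{df}{dg}}_{m \times 1} %}

so the product is the n × 1 gradient with respect to x, and only this ordering of the factors is dimensionally consistent.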
The following code takes an error function and backpropagates the input gradients from the output back through each layer. Along the way it calculates each layer's parameter gradient and asks the layer to update its parameters by that gradient, scaled by the step size.
import * as la from "./la.js"; // linear algebra helpers; the module path here is assumed

export function backpropagate(layers, input, output, error, step=0.01){
    let stack = evaluateStack(layers, input);
    // Seed the backward pass with the gradient of the error with
    // respect to the network's output.
    let top = stack[stack.length-1];
    top.inputGradient = error.inputGradient(top.input, output);
    let gradient = top.inputGradient;
    for(let i = stack.length-2; i >= 0; i--){
        let layer = stack[i];
        if(layer.parameterGradient !== undefined){
            // Chain the upstream gradient through the parameter gradient
            // and hand it to the layer, along with the step size.
            layer.layer.update(la.multiply(layer.parameterGradient, gradient), step);
        }
        // Chain the upstream gradient through this layer's input gradient;
        // in denominator layout the new factor multiplies on the left.
        if(layer.inputGradient !== undefined) gradient = la.multiply(layer.inputGradient, gradient);
    }
}
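To put it together, a training loop might look like the following sketch. The squaredError object and the trainingSet data are assumptions for illustration; the error term only needs an inputGradient method, and column vectors are written here as nested arrays, matching whatever convention the la module uses.

// Hypothetical squared error E = (1/2) * sum((y - target)^2); its gradient
// with respect to y, written as a column vector in denominator layout.
let squaredError = {
    inputGradient: (y, target) => y.map((v, i) => [v - target[i]])
};

// trainingSet is assumed to be an array of [input, output] pairs.
for(let epoch = 0; epoch < 100; epoch++){
    for(let [input, output] of trainingSet){
        backpropagate(layers, input, output, squaredError, 0.01);
    }
}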