Sequential Model Layers
Overview
A sequential model is a stack of layers in which the outputs of each layer feed directly into the inputs of the next. This section covers adding layers to a sequential model, removing the bias term, and dropout layers.
Adding Layers
Adding a layer to a model is accomplished with the "add" function. The following inputs are commonly used:
- inputShape - the shape of the inputs. Generally, this is given as [numberOfInputs].
Note, if the layers added to the model are dense, only the first layer needs to specify
the inputShape. That is because each dense layer takes all the outputs of the previous
layer as its inputs, so later layers can infer their input shape automatically.
- activation - (optional) the activation function to apply to the outputs
- units - the number of outputs (neurons) in the layer
const model = tf.sequential();
// Hidden layer: 10 inputs, 5 outputs.
model.add(tf.layers.dense({inputShape: [10], units: 5, activation: 'relu'}));
// Output layer: 2 outputs with a softmax activation.
model.add(tf.layers.dense({units: 2, activation: 'softmax'}));
Alternatively, the layers can all be specified in an array when the model is created.
const model = tf.sequential({
  layers: [
    tf.layers.dense({inputShape: [10], units: 5, activation: 'relu'}),
    tf.layers.dense({units: 2, activation: 'softmax'}),
  ]
});
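As a quick check, the layer stack can be inspected with model.summary(), which prints each layer's output shape and parameter count. A minimal sketch, assuming the two-layer model above:

// Prints a table with one row per layer; for example, the first dense layer
// has 10 * 5 weights + 5 biases = 55 parameters.
model.summary();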
The weights in each layer are initially randomized.
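The initial values can be inspected directly. A minimal sketch, assuming the model defined above:

// getWeights() returns the layer's tensors: the kernel followed by the bias.
const [kernel, bias] = model.layers[0].getWeights();
kernel.print();  // a randomly initialized [10, 5] matrix
bias.print();    // the [5] bias vector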
Removing the Bias
By default, the dense layers in a TensorFlow.js sequential model include a bias term. This can be turned off by including the following in the arguments to the layer.
{useBias: false}
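For example, the hidden layer from the earlier model can be created without a bias term as follows.

// Same hidden layer as before, but with no bias term.
model.add(tf.layers.dense({inputShape: [10], units: 5, activation: 'relu', useBias: false}));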
Dropout Layers
Dropout layers are a technique for reducing overfitting. During training, a dropout layer randomly deactivates a fraction of the neurons in the prior layer.
The following code adds a dropout layer which will deactivate 40% of the prior layer's neurons.
const model = tf.sequential({
  layers: [
    tf.layers.dense({inputShape: [10], units: 5, activation: 'relu'}),
    tf.layers.dropout({rate: 0.4}),
    tf.layers.dense({units: 2, activation: 'softmax'}),
  ]
});
Because some of the neurons are turned off during each training step, dropout can also reduce the computational load of the network during training. It is also thought to reduce the problems that arise when two neurons learn the same pattern.
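Dropout is only active during training; at inference time the layer passes its inputs through unchanged. A minimal sketch of this behavior, applying a standalone dropout layer to a tensor of ones (passing the training flag to apply here is an assumption about exercising the layer outside of model.fit):

const x = tf.ones([1, 10]);
const dropout = tf.layers.dropout({rate: 0.4});
// With training enabled, roughly 40% of the values are zeroed and the
// survivors are scaled by 1 / (1 - rate) to keep the expected sum constant.
dropout.apply(x, {training: true}).print();
// Without the training flag, dropout is a no-op and the ones pass through.
dropout.apply(x).print();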