What does model.add(Dropout(0.4)) mean in Keras?
Does it mean ignoring 40% of the neurons in the Neural Network? OR Does it mean ignoring the neurons that give probability = 0.4?
It means that each of the layer's neurons is dropped independently with probability 0.4 (its output set to zero) for the forward and backward pass, i.e. for one training iteration. So on average 40% of the neurons are ignored on each update, and a different random subset is chosen every time.
Have a look here for some of the reasons and benefits.
Have a look here for all the details of standard Dropout. It is important to notice that the remaining activations are commonly rescaled, either by p at inference time (the original formulation) or by 1/(1 - p) at training time (inverted dropout, which is what most modern frameworks use), so that the expected value of the layer's output stays roughly consistent across iterations. Different deep learning frameworks apply the scaling at different points, but the reason is the same.
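The masking and rescaling described above can be sketched in plain NumPy. This is a minimal illustration of inverted dropout, not Keras's actual implementation; the activation values here are made up for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
rate = 0.4  # the argument to Dropout(0.4): fraction of units zeroed

# hypothetical activations of one layer (10,000 units, random values)
x = rng.random(10_000)

# inverted dropout, as applied during training:
# 1) each unit is kept independently with probability 1 - rate
keep_mask = rng.random(x.shape) >= rate
# 2) surviving units are scaled by 1 / (1 - rate) so the expected
#    value of the layer output stays roughly unchanged
y = x * keep_mask / (1.0 - rate)

print(x.mean(), y.mean())  # the two means come out close
```

At inference time no units are dropped and no scaling is applied, because the training-time rescaling already kept the expected magnitude consistent.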
From the relevant Keras documentation:
Dropout consists in randomly setting a fraction rate of input units to 0 at each update during training time, which helps prevent overfitting.
Dropout, as its name suggests, randomly selects and rejects (drops) some of the layer's neurons, which achieves an ensemble-like effect: due to the random selection, different neurons are deactivated each time, so effectively a different thinned network makes the prediction on each update.
It helps prevent overfitting (like ensemble does).
The specific rate of 0.4 means that each of the layer's nodes is ignored with probability 0.4, i.e. about 40% of them on each update.
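The "40% per update, different neurons each time" behaviour can be checked with a quick simulation. This is a hypothetical sketch using NumPy, not Keras internals:

```python
import numpy as np

rng = np.random.default_rng(42)
rate = 0.4          # as in Dropout(0.4)
n_units = 100       # units in the layer
n_updates = 1_000   # simulated training updates

dropped_fractions = []
masks = []
for _ in range(n_updates):
    mask = rng.random(n_units) < rate  # True = unit dropped this update
    masks.append(mask)
    dropped_fractions.append(mask.mean())

# averaged over many updates, the dropped fraction converges to the rate
print(np.mean(dropped_fractions))  # ~0.4

# consecutive updates drop different subsets -> a different "thinned"
# network predicts each time, giving the ensemble-like effect
print(np.array_equal(masks[0], masks[1]))
```

Note that on any single update the dropped fraction fluctuates around 0.4, since each unit is dropped independently; the rate is exact only in expectation.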