4

What does model.add(Dropout(0.4)) mean in Keras?

Does it mean ignoring 40% of the neurons in the neural network? Or does it mean ignoring the neurons whose output probability is 0.4?

Noran
  • 758
  • 3
  • 8
  • 21
  • 4
    It means ignoring 40% of the neurons in the particular layer in the Neural Network where you have used dropout. – Ankit Seth Sep 05 '18 at 11:38
  • 1
    https://keras.io/layers/core/#dropout: `rate: float between 0 and 1. Fraction of the input units to drop.` Read Srivastava's 2012 paper on dropout for a basic introduction – Mohammad Athar Sep 05 '18 at 19:49
  • Related: https://datascience.stackexchange.com/questions/37021/why-does-adding-a-dropout-layer-improve-deep-machine-learning-performance-given – n1k31t4 Jun 11 '20 at 20:16

2 Answers

5

It means that, on each training iteration, 40% of the layer's neurons are randomly selected and their outputs are set to zero for that forward and backward pass.

Have a look here for some of the reasons and benefits.

Have a look here for all the details of standard dropout. It is important to notice that the remaining activations are commonly rescaled, either by 1/(1 − rate) during training (inverted dropout) or, equivalently, by scaling the weights by the keep probability at test time, so that the expected magnitude of the layer's output stays roughly consistent across iterations. Different deep learning frameworks apply this scaling at different points, but the reason is the same.


From the relevant Keras documentation:

Dropout consists in randomly setting a fraction rate of input units to 0 at each update during training time, which helps prevent overfitting.
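The mechanics of one training update can be sketched in plain NumPy (inverted dropout; the array names and sizes here are illustrative, not Keras internals):

```python
import numpy as np

rng = np.random.default_rng(0)
rate = 0.4                       # fraction of units to drop, as in Dropout(0.4)

activations = np.ones(10_000)    # pretend these are a layer's outputs

# Bernoulli mask: each unit is kept with probability 1 - rate
mask = rng.random(activations.shape) >= rate

# Inverted dropout: zero the dropped units and scale the kept ones
# by 1/(1 - rate) so the expected output matches the no-dropout case
dropped = activations * mask / (1 - rate)

print(mask.mean())      # fraction of units kept, close to 0.6
print(dropped.mean())   # close to 1.0, the original mean
```

Roughly 40% of the units are zeroed on this update; a fresh mask is drawn on the next one.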

n1k31t4
  • 14,663
  • 2
  • 28
  • 49
0

Dropout, as its name suggests, randomly selects and rejects (drops out) some of the layer's neurons. Because a different subset of neurons is deactivated on each update, a different sub-network makes the prediction each time, which yields an ensemble effect.

It helps prevent overfitting (much like an ensemble does).

A rate of 0.4 means that 40% of the layer's nodes are ignored on each update.
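In Keras this looks like the following (a minimal sketch assuming `tf.keras`; the layer sizes are arbitrary):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(128, activation="relu"),
    # During training, each update randomly zeroes 40% of the
    # 128 activations produced by the Dense layer above
    tf.keras.layers.Dropout(0.4),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
```

Note that the Dropout layer is only active during training (e.g. inside `model.fit`); at inference time it passes its input through unchanged.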

cugurm
  • 1
  • 1