ML Paper Challenge Day 21 — Dropout: A Simple Way to Prevent Neural Networks from Overfitting
Day 21: 2020.05.02
Paper: Dropout: A Simple Way to Prevent Neural Networks from Overfitting
Category: Model/Deep Learning/Technique (Dropout)
(Another way to achieve the same effect is to scale up the retained activations by multiplying by 1/p at training time, so the weights need not be modified at test time. This variant is commonly known as inverted dropout.)
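A minimal sketch of the inverted-dropout variant described above, assuming `p` is the probability of retaining a unit (as in the paper). The function name and NumPy-based implementation are illustrative, not from the paper:

```python
import numpy as np

def inverted_dropout(x, p, rng):
    """Inverted dropout (illustrative sketch).

    Each unit is kept with probability p; the retained activations are
    scaled by 1/p at training time, so the expected activation matches
    the test-time value and no weight rescaling is needed at test time.
    """
    mask = rng.random(x.shape) < p  # True where the unit is retained
    return x * mask / p

# Usage: during training, apply to a layer's activations.
rng = np.random.default_rng(0)
activations = np.ones(10_000)
dropped = inverted_dropout(activations, p=0.5, rng=rng)
# The mean of the scaled output stays close to the original mean.
```

At test time this function is simply not called; the untouched activations are used directly, which is the whole point of scaling during training instead.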
The two figures above summarise the core idea of Dropout.
Dropout
- In Dropout, a neural net with n units can be seen as a collection of 2^n possible thinned neural networks, one for each pattern of dropped units.
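The 2^n count above comes from each unit being independently on or off. A tiny enumeration (for a hypothetical net with 3 units) makes the counting concrete:

```python
from itertools import product

n = 3  # a toy network with 3 units
# Each mask is one on/off pattern over the units, i.e. one thinned network.
masks = list(product([0, 1], repeat=n))
print(len(masks))  # 2**3 = 8 thinned sub-networks
```

For a real network n is large, so these sub-networks are never enumerated explicitly; each training step samples one mask, implicitly training a different thinned network that shares weights with all the others.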