
ML Paper Challenge Day 21 — Dropout: A Simple Way to Prevent Neural Networks from Overfitting

Chun-kit Ho
4 min read · May 2, 2020


Day 21: 2020.05.02
Paper: Dropout: A Simple Way to Prevent Neural Networks from Overfitting
Category: Model/Deep Learning/Technique (Dropout)

In the paper's scheme, each unit is retained with probability p at training time, and the outgoing weights are multiplied by p at test time so that expected activations match. (Another way to achieve the same effect, often called inverted dropout, is to scale up the retained activations by 1/p at training time and leave the weights unmodified at test time.)
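A minimal sketch of the two equivalent scalings (function names and structure are illustrative, not the paper's code): classic dropout multiplies by p at test time, while inverted dropout divides by p at training time so the test-time pass is the identity.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_train(x, p, inverted=True):
    """Apply dropout to activations x, keeping each unit with probability p.

    With inverted=True, retained activations are scaled by 1/p at training
    time, so their expectation already matches the test-time forward pass.
    """
    mask = rng.random(x.shape) < p      # Bernoulli(p) retention mask
    out = x * mask                      # drop each unit independently
    if inverted:
        out = out / p                   # compensate during training
    return out

def dropout_test(x, p, inverted=True):
    # Inverted dropout: nothing to do at test time.
    # Classic dropout: scale activations (equivalently, weights) by p.
    return x if inverted else x * p

x = np.ones((2000, 500))
p = 0.5
# Both schemes match in expectation: the training-time mean activation
# under inverted dropout is close to the test-time activation of 1.0.
print(dropout_train(x, p).mean())
```

Under either convention, the expected activation seen by the next layer is the same at training and test time, which is the point of the scaling.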

The two figures above (from the paper) summarise the core idea of Dropout.

Dropout

  • In Dropout, a neural net with n units can be seen as a collection of 2^n possible thinned networks, one for each subset of units that is retained.
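A toy illustration of this counting argument (illustrative code, not from the paper): with n units there is one binary keep/drop choice per unit, hence 2^n thinned sub-networks, and the test-time rule of scaling activations by p exactly averages a unit's output over all equally likely masks when p = 0.5.

```python
import itertools
import numpy as np

n = 3
# Every binary keep/drop mask over n units is one thinned network.
masks = list(itertools.product([0, 1], repeat=n))
print(len(masks))  # 2**n = 8 thinned networks for n = 3

x = np.array([1.0, 2.0, 3.0])   # activations of the n units
p = 0.5                         # retention probability

# Average the masked activations over all 2^n equally likely masks...
avg = np.mean([x * np.array(m) for m in masks], axis=0)

# ...which coincides with dropout's test-time rule: scale by p.
print(np.allclose(avg, p * x))
```

For p other than 0.5 the exact average weights each mask by its probability, but the same scaling-by-p rule still recovers the expected activation of each unit.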

