
Day 20: 2020.05.01
Paper: Improving neural networks by preventing co-adaptation of feature detectors
Category: Model/Deep Learning/Technique (Dropout)

Dropout

  • Randomly omits half of the feature detectors on each training case
  • Prevents complex co-adaptations in which a feature detector is only helpful in the context of several other specific feature detectors
  • Each neuron learns to detect a feature that is generally helpful for producing the correct answer, given the combinatorially large variety of internal contexts in which it must operate
  • A very efficient way of performing model averaging with neural networks
  • Random dropout makes it possible to train a huge number of different networks in a reasonable time
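The idea above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's original implementation: during training each activation is zeroed with probability 0.5, and at test time no units are dropped but activations are scaled by 0.5 so their expected value matches training (the paper's "mean network" approximation). The function name and signature are my own for illustration.

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, training=True, rng=None):
    """Dropout on a layer's activations.

    Training: zero each activation independently with probability p_drop.
    Test: keep all units but scale by (1 - p_drop), approximating an
    average over the exponentially many "thinned" networks.
    """
    if not training:
        return x * (1.0 - p_drop)
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p_drop  # keep each unit with prob 1 - p_drop
    return x * mask

# Example: activations of a hidden layer on one training case
h = np.ones((1, 8))
h_train = dropout_forward(h, p_drop=0.5, training=True)   # roughly half zeroed
h_test = dropout_forward(h, p_drop=0.5, training=False)   # all scaled to 0.5
```

Because a fresh random mask is drawn for every training case, each forward pass effectively trains a different sub-network, which is what makes this a cheap form of model averaging.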

Written by Chun-kit Ho

Cloud architect @ EY | full-stack software engineer | social innovation | certified professional solutions architect in AWS & GCP
