ML Paper Challenge Day 20 — Improving neural networks by preventing co-adaptation of feature detectors
4 min read · May 1, 2020
Day 20: 2020.05.01
Paper: Improving neural networks by preventing co-adaptation of feature detectors
Category: Model/Deep Learning/Technique (Dropout)
Dropout
- randomly omit half of the feature detectors (hidden units) on each training case
- prevents complex co-adaptations, in which a feature detector is only helpful in the context of several other specific feature detectors
- instead, each neuron learns to detect a feature that is generally helpful for producing the correct answer, given the combinatorially large variety of internal contexts in which it must operate
- a very efficient way of performing model averaging with neural networks: each training case effectively trains a different randomly sampled sub-network, and all of these networks share weights
- random dropout makes it possible to train a huge number of different networks in a reasonable amount of time
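The idea above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact training setup: it uses the "inverted dropout" convention (surviving activations are scaled up by 1/(1−p) during training), whereas the original paper instead halves the outgoing weights at test time; the two conventions are equivalent in expectation. The function name and shapes here are made up for the example.

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, train=True, rng=None):
    """Apply dropout to a batch of activations x.

    During training, each unit is zeroed independently with
    probability p_drop (the paper uses 0.5 for hidden units).
    Surviving activations are scaled by 1 / (1 - p_drop) so the
    expected activation matches test time, where the full network
    is used with no mask (this is the "inverted dropout" variant).
    """
    if not train:
        return x  # test time: use all units, approximating model averaging
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p_drop  # keep each unit with prob 1 - p_drop
    return x * mask / (1.0 - p_drop)

# Example: activations for a batch of 4 examples, 6 hidden units
h = np.ones((4, 6))
h_train = dropout_forward(h, p_drop=0.5, train=True)   # roughly half zeroed, rest scaled to 2.0
h_test = dropout_forward(h, train=False)               # unchanged at test time
```

Because each training case samples a fresh mask, every step trains a different sub-network, yet all sub-networks share the same weight matrices, which is what makes this an efficient form of model averaging.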