
Day 22: 2020.05.03
Paper: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Category: Model/Deep Learning/Technique (Batch Normalisation)

Batch Normalisation

Background:

Internal covariate shift: The distribution of each layer’s inputs changes during training, as the parameters of the previous layers change.

Internal covariate shift -> slows down training by requiring lower learning rates and careful parameter initialisation, and makes it notoriously hard to train models with saturating nonlinearities.

Method — Batch Normalisation:

  • making normalisation a part of the model architecture and performing the normalisation for each training mini-batch (a minimal sketch of the transform follows below)

