ML Paper Challenge Day 25 — Decoupled Neural Interfaces using Synthetic Gradients
Day 25: 2020.05.06
Paper: Decoupled Neural Interfaces using Synthetic Gradients
Category: Model/Deep Learning/Training Method
A really interesting and innovative method for training neural networks that decouples the individual layers during training.
Decoupled Neural Interfaces
Background
Traditionally, there are three kinds of locking in neural network computation; synthetic gradients aim to remove the latter two (see the sketch after this list).
- Forwards Locking – no module can process its incoming data before the previous modules in the directed forward graph have executed.
- Update Locking – no module can be updated before all dependent modules have executed in forwards mode.
- Backwards Locking – no module can be updated before all dependent modules have executed in both forwards and backwards mode.
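To make the idea concrete, here is a minimal PyTorch-style sketch of one decoupled layer. This is my own illustration rather than the paper's code: the class name `SyntheticGradientLayer`, the method names, and the choice of a single linear layer for the synthetic gradient model are all hypothetical. The key idea it shows is that each layer owns a small auxiliary model that predicts the gradient of the loss with respect to the layer's output, so the layer can update immediately instead of waiting for the true backward pass.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntheticGradientLayer(nn.Module):
    """One layer plus a small model M that predicts dL/dh for its output h.

    A minimal sketch, not the paper's implementation: M here is a single
    linear map, and both optimizers are plain SGD.
    """

    def __init__(self, in_dim, out_dim, lr=0.01):
        super().__init__()
        self.layer = nn.Linear(in_dim, out_dim)
        # M: maps the layer's activation to a predicted loss gradient.
        self.grad_model = nn.Linear(out_dim, out_dim)
        self.opt = torch.optim.SGD(self.layer.parameters(), lr=lr)
        self.grad_opt = torch.optim.SGD(self.grad_model.parameters(), lr=lr)

    def forward_and_update(self, x):
        h = torch.relu(self.layer(x))
        # Predict a synthetic gradient from the activation alone and use it
        # to update the layer right away, without waiting for downstream
        # modules: this removes update and backwards locking for this layer.
        synthetic_grad = self.grad_model(h.detach())
        self.opt.zero_grad()
        h.backward(synthetic_grad.detach())
        self.opt.step()
        # Detach h so the next layer is decoupled from this layer's graph.
        return h.detach(), synthetic_grad

    def update_grad_model(self, synthetic_grad, target_grad):
        # When the true gradient (or a downstream synthetic estimate)
        # eventually arrives, regress M's prediction towards it.
        self.grad_opt.zero_grad()
        F.mse_loss(synthetic_grad, target_grad.detach()).backward()
        self.grad_opt.step()
```

In the paper, the synthetic gradient model is itself trained to match the true gradient (or the next module's synthetic estimate), which is what `update_grad_model` gestures at; the authors also show a variant that conditions the synthetic gradient model on the labels.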