
Day 25: 2020.05.06
Paper: Decoupled Neural Interfaces using Synthetic Gradients
Category: Model/Deep Learning/Training Method

A really interesting and innovative method for training neural networks that decouples the layers from one another during training.

Decoupled Neural Interfaces

Background

Traditionally, there are three kinds of locking in neural network computation:

  1. Forward Locking — no module can process its incoming data before the previous nodes in the directed forward graph have executed.
  2. Update Locking — no module can be updated before all dependent modules have executed in forwards mode. This is the lock that synthetic gradients remove (see the sketch after this list).
  3. Backwards Locking — no module can be updated before all dependent modules have executed in both forwards mode and backwards mode.
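
To make update locking concrete, here is a minimal sketch in PyTorch of the core idea (my own illustration, not the paper's code; the layer sizes and names are hypothetical). A small auxiliary network predicts the gradient of the loss with respect to a layer's activation, so the layer can update immediately instead of waiting for the true backward pass. The auxiliary network is itself trained to match the true gradient whenever that gradient eventually becomes available.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

layer = nn.Linear(32, 32)   # the layer we want to decouple
synth = nn.Linear(32, 32)   # synthetic-gradient module M: predicts dL/dh from h
opt_layer = torch.optim.SGD(layer.parameters(), lr=0.1)
opt_synth = torch.optim.SGD(synth.parameters(), lr=0.1)

x = torch.randn(8, 32)      # a dummy mini-batch

# --- decoupled update: the layer updates from the predicted gradient ---
h = layer(x)
grad_hat = synth(h.detach())     # synthetic gradient; no true backward pass needed
opt_layer.zero_grad()
h.backward(grad_hat.detach())    # inject grad_hat as if it were dL/dh
opt_layer.step()                 # layer updated before downstream modules even run

# --- later, when the true gradient arrives, train M to match it ---
grad_true = torch.randn(8, 32)   # stand-in for the real dL/dh from downstream
loss_synth = ((synth(h.detach()) - grad_true) ** 2).mean()
opt_synth.zero_grad()
loss_synth.backward()
opt_synth.step()
```

Note the key trick: `tensor.backward(gradient)` lets us feed any gradient into the backward pass, so the layer's parameters are updated from the prediction rather than from the rest of the network. This is what breaks update locking.
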
