DeepMind papers at ICML 2017 (part one)

Decoupled Neural Interfaces using Synthetic Gradients

Authors: Max Jaderberg, Wojciech Marian Czarnecki, Simon Osindero, Oriol Vinyals, Alex Graves, David Silver, Koray Kavukcuoglu

When training neural networks, the modules (layers) are locked: they can only be updated after backpropagation completes. We remove this constraint by incorporating a learnt model of error gradients, called Synthetic Gradients, which allows networks to be updated without full backpropagation. We show how this can be applied to feed-forward networks, allowing every layer to be trained asynchronously; to RNNs, extending the time over which models can remember; and to multi-network systems, enabling communication between networks.
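To make the idea concrete, here is a minimal sketch in PyTorch of training one layer with a synthetic gradient. The architecture, layer sizes, and optimiser settings are illustrative assumptions, not the paper's setup: a small auxiliary module predicts the error gradient at a layer's output, the layer is updated immediately from that prediction, and the auxiliary module is then regressed toward the true gradient once the downstream loss is available.

```python
# Sketch of a decoupled layer trained with a synthetic gradient.
# Sizes, modules, and learning rates are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

layer = nn.Sequential(nn.Linear(10, 32), nn.ReLU())  # decoupled layer
sg_module = nn.Linear(32, 32)   # learnt model predicting dL/dh from h
head = nn.Linear(32, 1)         # downstream network producing the true loss

opt_layer = torch.optim.SGD(layer.parameters(), lr=0.01)
opt_sg = torch.optim.SGD(sg_module.parameters(), lr=0.001)
opt_head = torch.optim.SGD(head.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for step in range(200):
    x = torch.randn(16, 10)
    y = torch.randn(16, 1)

    # Forward through the decoupled layer.
    h = layer(x)

    # 1) Update the layer immediately using the predicted gradient,
    #    without waiting for the downstream backward pass.
    synthetic_grad = sg_module(h.detach())
    opt_layer.zero_grad()
    h.backward(synthetic_grad.detach())  # inject the dL/dh estimate
    opt_layer.step()

    # 2) Run the rest of the network on a detached copy of h to obtain
    #    the true gradient dL/dh while training the downstream head.
    h2 = h.detach().requires_grad_(True)
    loss = loss_fn(head(h2), y)
    opt_head.zero_grad()
    loss.backward()
    opt_head.step()
    true_grad = h2.grad.detach()

    # 3) Regress the synthetic gradient toward the true gradient.
    sg_loss = loss_fn(synthetic_grad, true_grad)
    opt_sg.zero_grad()
    sg_loss.backward()
    opt_sg.step()
```

Because the layer update in step 1 depends only on locally available activations, it could in principle run asynchronously with respect to the downstream computation; the sequential loop above is just for clarity.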

For further details and related work, please see the paper.

Check it out at ICML:

Monday 07 August, 10:30-10:48 @ Darling Harbour Theatre (Talk)

Monday 07 August, 18:30-22:00 @ Gallery #1 (Poster)


Source: https://deepmind.com/blog/article/deepmind-papers-icml-2017-part-one
