Benchmarking Decoupled Neural Interfaces with Synthetic Gradients
Saved in:

Main Author: |
Format: | Article
Language: | English
Subjects: |
Online Access: | Order full text
Summary: | Artificial neural networks are a class of learning systems modeled
after biological neural function, with an interesting penchant for Hebbian
learning, that is, "neurons that fire together, wire together". However, unlike
their natural counterparts, artificial neural networks exhibit a close and
stringent coupling between the modules of neurons in the network. This
coupling, or locking, imposes on the network a strict and inflexible structure
that prevents layers from updating their weights until a full feed-forward and
backward pass has occurred. Such a constraint, though it may have sufficed for
a while, is no longer feasible in the era of very-large-scale machine learning
and the growing desire to parallelize learning across multiple computing
infrastructures. To solve this problem, synthetic gradients (SG) with decoupled
neural interfaces (DNI) have been introduced as a viable alternative to the
backpropagation algorithm. This paper benchmarks the speed and accuracy of
SG-DNI against a standard neural interface, using a multilayer perceptron
(MLP). SG-DNI shows good promise: it not only captures the learning problem,
it is also over three-fold faster due to its asynchronous learning
capabilities.
DOI: | 10.48550/arxiv.1712.08314
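The summary describes the decoupling mechanism only in prose. As a rough illustration, below is a minimal PyTorch sketch of one layer paired with a synthetic gradient module: a small local model that predicts the gradient of the loss with respect to the layer's activations, so the layer can update its weights immediately instead of staying locked until the full forward and backward pass completes. Everything here is an assumption for illustration, not the paper's benchmarked implementation: the names (`DecoupledLayer`, `decoupled_step`, `train_sg`), the linear SG model, the dimensions, and the optimizers are all hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecoupledLayer(nn.Module):
    """One MLP block plus a local synthetic-gradient (SG) model (illustrative sketch)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.layer = nn.Linear(in_dim, out_dim)
        # The SG model predicts dLoss/dh from the activation h itself,
        # so this block can update before the true backward pass arrives.
        self.sg_model = nn.Linear(out_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.layer(x))


def decoupled_step(block: DecoupledLayer, opt: torch.optim.Optimizer,
                   x: torch.Tensor) -> torch.Tensor:
    """Update `block` immediately using its predicted (synthetic) gradient."""
    h = block(x)
    grad_hat = block.sg_model(h).detach()  # prediction treated as a constant
    opt.zero_grad()
    h.backward(grad_hat)  # inject the synthetic gradient: no update locking
    opt.step()
    return h.detach()     # detached activation is passed downstream


def train_sg(block: DecoupledLayer, sg_opt: torch.optim.Optimizer,
             h: torch.Tensor, true_grad: torch.Tensor) -> None:
    """When the true dLoss/dh eventually arrives, regress the SG model onto it."""
    loss = F.mse_loss(block.sg_model(h), true_grad)
    sg_opt.zero_grad()
    loss.backward()
    sg_opt.step()


# Toy usage: the block updates without waiting for a full backward pass.
x = torch.randn(32, 128)
block = DecoupledLayer(128, 64)
opt = torch.optim.SGD(block.layer.parameters(), lr=1e-2)
sg_opt = torch.optim.SGD(block.sg_model.parameters(), lr=1e-2)
h = decoupled_step(block, opt, x)
# Stand-in target: in a real run this would be the true gradient delivered
# later by the layers above, not zeros.
train_sg(block, sg_opt, h, torch.zeros_like(h))
```

Because the layer consumes a predicted gradient rather than the true one, each block in the network can step its optimizer as soon as its activations exist, which is the asynchrony the abstract credits for the reported speedup; the SG model itself keeps learning by regressing onto the true gradient whenever it does arrive.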