Synaptic Metaplasticity in Binarized Neural Networks
Main authors: | , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | While deep neural networks have surpassed human performance in multiple situations, they are prone to catastrophic forgetting: upon training on a new task, they rapidly forget previously learned ones. Neuroscience studies, based on idealized tasks, suggest that in the brain, synapses overcome this issue by adjusting their plasticity depending on their past history. However, such "metaplastic" behaviours do not transfer directly to mitigate catastrophic forgetting in deep neural networks. In this work, we interpret the hidden weights used by binarized neural networks, a low-precision version of deep neural networks, as metaplastic variables, and modify their training technique to alleviate forgetting. Building on this idea, we propose and demonstrate experimentally, in multitask and stream-learning settings, a training technique that reduces catastrophic forgetting without needing previously presented data or formal boundaries between datasets, with performance approaching that of more mainstream techniques that rely on task boundaries. We support our approach with a theoretical analysis of a tractable task. This work bridges computational neuroscience and deep learning, and presents significant assets for future embedded and neuromorphic systems, especially when using novel nanodevices whose physics is analogous to metaplasticity. |
DOI: | 10.48550/arXiv.2003.03533 |
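
The abstract describes modifying the training of binarized neural networks so that their real-valued hidden weights act as metaplastic variables: the larger a hidden weight has grown, the harder it becomes for later gradients to flip the corresponding binary weight. Below is a minimal NumPy sketch of one such metaplastic update step; the attenuation function `1 - tanh^2(m*|w|)`, the hyperparameter `m`, and the function name `metaplastic_update` are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import numpy as np

def metaplastic_update(w_hidden, grad, lr=0.01, m=1.0):
    """One metaplastic update of a binarized layer's hidden weights.

    The binary weights used in the forward pass are sign(w_hidden);
    `grad` is the (straight-through) gradient w.r.t. those binary weights.
    `m` sets how strongly large hidden weights are consolidated (assumed form).
    """
    update = -lr * grad
    # An update points "toward zero" when its sign opposes the hidden
    # weight's sign, i.e. when it could eventually flip the binary weight.
    toward_zero = update * w_hidden < 0
    # Attenuate only those updates, more strongly the larger |w_hidden| is:
    # large hidden weights behave as consolidated, hard-to-flip synapses.
    f_meta = 1.0 - np.tanh(m * np.abs(w_hidden)) ** 2
    return w_hidden + np.where(toward_zero, f_meta * update, update)

# Toy usage: both gradients push the weights downward, but only the
# weakly consolidated weight actually moves.
w = np.array([3.0, 0.1])
g = np.array([1.0, 1.0])
print(metaplastic_update(w, g, lr=0.5, m=2.0))  # approx. [3.0, -0.38]
```

In this toy example the strongly consolidated weight (|w| = 3.0) barely moves under an opposing gradient, while the weakly consolidated one (|w| = 0.1) changes freely, which is the history-dependent plasticity the abstract refers to.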