Oja's plasticity rule overcomes several challenges of training neural networks under biological constraints
| Published in: | arXiv.org, 2024-10 |
|---|---|
| Main authors: | , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Full text |
| ISSN: | 2331-8422 |
Abstract: There is a large literature on the similarities and differences between biological neural circuits and deep artificial neural networks (DNNs). However, modern training of DNNs relies on several engineering tricks, such as data batching, normalization, adaptive optimizers, and precise weight initialization. Despite their critical role in training DNNs, these engineering tricks are often overlooked when drawing parallels between biological and artificial networks, potentially because there is little evidence of their direct biological implementation. In this study, we show that Oja's plasticity rule partly overcomes the need for some of these engineering tricks. Specifically, under difficult but biologically realistic learning scenarios, such as online learning, deep architectures, and sub-optimal weight initialization, Oja's rule can substantially improve the performance of pure backpropagation. Our results demonstrate that simple synaptic plasticity rules can overcome learning challenges that are typically addressed with less biologically plausible techniques when training DNNs.
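For orientation, the classic form of Oja's rule for a single linear unit with output y = w·x is Δw = η y (x − y w): a Hebbian term η y x plus a subtractive term −η y² w that implicitly keeps ‖w‖ near 1, which is the self-normalizing property alluded to in the abstract. The NumPy sketch below is a minimal illustration of this online update only; the toy data, the learning rate `eta`, and the training loop are assumptions made for demonstration and are not the paper's setup or its combination with backpropagation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 1000 samples of 5-dimensional inputs with correlated
# features (assumed for illustration; not from the paper).
X = rng.standard_normal((1000, 5)) @ rng.standard_normal((5, 5))

eta = 0.01                      # learning rate (assumed value)
w = rng.standard_normal(5)
w /= np.linalg.norm(w)          # start from a unit-norm weight vector

# Online Oja updates: for each sample x, compute y = w.x, then
# w <- w + eta * y * (x - y * w).
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)

# Oja's rule drives w toward the top principal component of the
# input distribution while keeping ||w|| near 1.
print(np.linalg.norm(w))
```

Note the design point this sketch highlights: the update is purely local (it uses only the unit's input x, output y, and its own weights w) and operates sample by sample, which is why it is compatible with the online, batch-free learning scenarios the abstract describes.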