Expanding memory in recurrent spiking networks

Full Description

Bibliographic Details
Published in: arXiv.org, 2023-10
Main Authors: Balafrej, Ismael; Alibart, Fabien; Rouat, Jean
Format: Article
Language: English
Online Access: Full Text
Description
Summary: Recurrent spiking neural networks (RSNNs) are notoriously difficult to train because of the vanishing gradient problem, which is exacerbated by the binary nature of the spikes. In this paper, we review the ability of current state-of-the-art RSNNs to solve long-term memory tasks and show that they face strong constraints, both in performance and in their implementation on analog neuromorphic processors. We present a novel spiking neural network that circumvents these limitations. Our biologically inspired network uses synaptic delays, branching-factor regularization, and a novel surrogate derivative for the spiking function. The proposed network proves more successful at using its recurrent connections on memory tasks.
ISSN: 2331-8422
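
The summary names three mechanisms (synaptic delays, branching-factor regularization, and a novel surrogate derivative for the spiking function) without defining them. For orientation only, below is a minimal PyTorch sketch of the general surrogate-gradient idea that such a derivative plugs into; the fast-sigmoid surrogate, the scale value, and the FastSigmoidSpike / spike_fn names are illustrative assumptions, not the authors' actual derivative.

import torch

class FastSigmoidSpike(torch.autograd.Function):
    # Heaviside spike in the forward pass, smooth surrogate in the backward
    # pass. The paper proposes its own surrogate derivative, which is not
    # specified in this record; the fast-sigmoid surrogate below is a common
    # stand-in, used here purely for illustration.

    @staticmethod
    def forward(ctx, v):
        # v: membrane potential minus threshold; emit a binary spike when it
        # crosses zero.
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Replace the Heaviside's zero-almost-everywhere derivative with
        # 1 / (scale * |v| + 1)^2 so gradients can flow through the spike.
        scale = 10.0  # steepness hyperparameter; illustrative, not from the paper
        return grad_output / (scale * v.abs() + 1.0) ** 2

spike_fn = FastSigmoidSpike.apply  # drop-in spiking nonlinearity for an RSNN cell

Because the backward pass sees a bounded, nonzero surrogate instead of the true derivative of the step function, gradients can propagate through long spike sequences, which is the failure mode the summary attributes to binary spikes.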