Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks

Bibliographic Details
Published in: Nature Machine Intelligence, 2021-10, Vol. 3 (10), pp. 905-913
Authors: Yin, Bojian; Corradi, Federico; Bohté, Sander M.
Format: Article
Language: English
Online access: Full text
Description
Abstract: Inspired by detailed modelling of biological neurons, spiking neural networks (SNNs) are investigated as biologically plausible and high-performance models of neural computation. The sparse and binary communication between spiking neurons potentially enables powerful and energy-efficient neural networks. The performance of SNNs, however, has remained lacking compared with artificial neural networks. Here we demonstrate how an activity-regularizing surrogate gradient combined with recurrent networks of tunable and adaptive spiking neurons yields the state of the art for SNNs on challenging benchmarks in the time domain, such as speech and gesture recognition. This also exceeds the performance of standard classical recurrent neural networks and approaches that of the best modern artificial neural networks. As these SNNs exhibit sparse spiking, we show that they are theoretically one to three orders of magnitude more computationally efficient than recurrent neural networks with similar performance. Together, this positions SNNs as an attractive solution for AI hardware implementations.

The use of sparse signals in spiking neural networks, modelled on biological neurons, offers in principle a highly efficient approach for artificial neural networks when implemented on neuromorphic hardware, but new training approaches are needed to improve performance. Using a new type of activity-regularizing surrogate gradient for backpropagation combined with recurrent networks of tunable and adaptive spiking neurons, state-of-the-art performance for spiking neural networks is demonstrated on benchmarks in the time domain.
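To make the described approach concrete, the following is a minimal PyTorch sketch of an adaptive leaky integrate-and-fire (ALIF) recurrent cell with a surrogate spike gradient, in the spirit of the abstract's tunable and adaptive spiking neurons trained with an activity-regularizing surrogate gradient. The class names, constants, surrogate shape and reset scheme are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class SurrogateSpike(torch.autograd.Function):
    # Heaviside step in the forward pass; a smooth surrogate derivative
    # (assumed fast-sigmoid shape) in the backward pass.
    @staticmethod
    def forward(ctx, v_minus_thr):
        ctx.save_for_backward(v_minus_thr)
        return (v_minus_thr > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thr,) = ctx.saved_tensors
        surrogate = 1.0 / (1.0 + 10.0 * v_minus_thr.abs()) ** 2
        return grad_output * surrogate

class AdaptiveLIFCell(nn.Module):
    # One time step of a recurrent layer of adaptive LIF neurons with
    # trainable per-neuron membrane and adaptation time constants.
    def __init__(self, n_in, n_hidden):
        super().__init__()
        self.fc_in = nn.Linear(n_in, n_hidden)
        self.fc_rec = nn.Linear(n_hidden, n_hidden, bias=False)
        self.tau_m = nn.Parameter(torch.full((n_hidden,), 20.0))    # membrane time constant (assumed init)
        self.tau_adp = nn.Parameter(torch.full((n_hidden,), 200.0)) # adaptation time constant (assumed init)
        self.beta = 1.8  # coupling of adaptation variable to threshold (assumed)
        self.b0 = 0.1    # baseline firing threshold (assumed)

    def forward(self, x, v, b, s):
        alpha = torch.exp(-1.0 / self.tau_m)   # membrane decay per step
        rho = torch.exp(-1.0 / self.tau_adp)   # adaptation decay per step
        b = rho * b + (1.0 - rho) * s          # spike-driven threshold adaptation
        thr = self.b0 + self.beta * b
        i_t = self.fc_in(x) + self.fc_rec(s)   # feed-forward + recurrent input
        v = alpha * v + (1.0 - alpha) * i_t - s * thr  # leaky integration, soft reset
        s = SurrogateSpike.apply(v - thr)      # binary spike with surrogate gradient
        return v, b, s

# Example: unroll the cell over a random input sequence of length T.
batch, T, n_in, n_hidden = 4, 100, 40, 128
cell = AdaptiveLIFCell(n_in, n_hidden)
x_seq = torch.randn(batch, T, n_in)
v = torch.zeros(batch, n_hidden)
b = torch.zeros(batch, n_hidden)
s = torch.zeros(batch, n_hidden)
spikes = []
for t in range(T):
    v, b, s = cell(x_seq[:, t], v, b, s)
    spikes.append(s)
# An activity regularizer of the kind mentioned in the abstract could simply
# penalize the mean firing rate, e.g. loss += lam * torch.stack(spikes).mean().

In the paper's setting such cells would be stacked into recurrent layers and trained with backpropagation through time; the sketch only illustrates the neuron dynamics and the surrogate-gradient trick.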
ISSN: 2522-5839
DOI: 10.1038/s42256-021-00397-w