A Memristive Spiking Neural Network Circuit With Selective Supervised Attention Algorithm

Bibliographic Details
Published in: IEEE transactions on computer-aided design of integrated circuits and systems, 2023-08, Vol. 42 (8), p. 2604-2617
Authors: Deng, Zekun; Wang, Chunhua; Lin, Hairong; Sun, Yichuang
Format: Article
Language: English
Abstract: Spiking neural networks (SNNs) are biologically plausible and computationally powerful. Computing systems based on the von Neumann architecture currently serve as virtually the only hardware basis for implementing SNNs. However, performance bottlenecks in computing speed, cost, and energy consumption hinder the hardware development of SNNs, so efficient non-von Neumann hardware computing systems for SNNs remain to be explored. In this article, a selective supervised algorithm for spiking neurons (SNs), inspired by the selective attention mechanism, is proposed, and a memristive SN circuit as well as a memristive SNN circuit based on the proposed algorithm are designed. The memristor realizes the learning and memory of the synaptic weights. The proposed algorithm includes a top-down (TD) selective supervision method and a bottom-up (BU) selective supervision method, and shows excellent performance on sequence learning compared with other supervised algorithms. Moreover, TD and BU attention encoding circuits are designed to provide the hardware foundation for encoding external stimuli into TD and BU attention spikes, respectively. After learning only a small number of labeled samples, the proposed memristive SNN circuit performs classification on the MNIST and Fashion-MNIST datasets with superior accuracy, which greatly reduces the cost of manual annotation and improves the supervised learning efficiency of the circuit.
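The abstract gives no equations, so the following is a purely illustrative sketch, not the paper's algorithm: a leaky integrate-and-fire spiking neuron whose synaptic weights (loosely analogous to memristor conductances) are nudged by a simple supervised delta-rule update toward a target spike train. The function name `lif_supervised_step` and all parameters are hypothetical choices for this example.

```python
import numpy as np

def lif_supervised_step(w, x, v, target_spike, tau=20.0, v_th=1.0, lr=0.01, dt=1.0):
    """One timestep of a leaky integrate-and-fire (LIF) neuron with a
    simple supervised weight update (illustrative only, not the paper's rule).

    w            -- synaptic weights (analogous to memristor conductances)
    x            -- binary input spike vector at this timestep
    v            -- membrane potential carried over from the last step
    target_spike -- desired output spike (0 or 1) from the supervision signal
    """
    # Leaky integration of weighted input spikes.
    v = v + dt * (-v / tau + np.dot(w, x))
    spike = 1 if v >= v_th else 0
    if spike:
        v = 0.0  # reset membrane potential after firing
    # Delta-rule-like update: push weights toward producing the target spike;
    # active synapses are strengthened (or weakened) when output and target differ.
    w = w + lr * (target_spike - spike) * x
    return w, v, spike

# Usage: drive the neuron with constant input spikes and a target of "fire".
w, v = np.full(5, 0.1), 0.0
x = np.ones(5)
for _ in range(50):
    w, v, spike = lif_supervised_step(w, x, v, target_spike=1)
```

In a memristive realization, the weight update above would correspond to programming pulses that shift each memristor's conductance; the sketch only captures the abstract's idea that supervision adjusts stored synaptic weights.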
ISSN: 0278-0070, 1937-4151
DOI: 10.1109/TCAD.2022.3228896