An 8-Bit In Resistive Memory Computing Core With Regulated Passive Neuron and Bitline Weight Mapping

Bibliographic Details
Published in: IEEE Transactions on Very Large Scale Integration (VLSI) Systems, 2022-04, Vol. 30 (4), pp. 379-391
Authors: Zhang, Yewei; Huang, Kejie; Xiao, Rui; Wang, Bo; Xu, Yanfeng; Fan, Jicong; Shen, Haibin
Format: Article
Language: English
Description
Abstract: The rapid development of artificial intelligence (AI) and the Internet of Things (IoT) increases the demand for edge-computing devices with low power consumption and relatively high processing speed. Computing-in-memory (CIM) schemes based on emerging resistive nonvolatile memory (NVM) show great potential for reducing the power consumption of AI computing. However, the inconsistency of the NVM devices may significantly degrade the performance of the neural network. In this article, we propose a low-power resistive RAM (RRAM)-based CIM core that not only achieves high computing efficiency but also greatly enhances robustness through a bit-line (BL) regulator and a BL weight mapping algorithm. Simulation results show that the power consumption of the proposed 8-bit CIM core is only 12.6 mW for a 256 × 256 array in 8-bit mode. The spurious-free dynamic range (SFDR) and signal-to-noise-and-distortion ratio (SNDR) of the CIM core reach 62.64 dB and 45.92 dB, respectively. The proposed BL weight mapping scheme improves the top-1 accuracy by 2.46% for AlexNet and 3.47% for VGG16 on the ImageNet Large Scale Visual Recognition Challenge 2012 (ILSVRC 2012) dataset in 8-bit mode.
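To make the CIM arithmetic concrete, the sketch below models a bit-sliced matrix-vector multiplication on a 256 × 256 RRAM crossbar with per-cell conductance variation, which is the error source that a BL regulator and BL weight mapping aim to suppress. This is a minimal illustration only: the on/off conductance values (G_ON, G_OFF), the variation level (SIGMA), the use of unsigned weights, and the simple one-bit-per-bit-line mapping are assumptions, and the paper's actual regulator circuit and mapping algorithm are not reproduced here.

```python
# Illustrative sketch of bit-sliced CIM matrix-vector multiplication with
# RRAM conductance variation. Array size follows the abstract (256 x 256, 8-bit);
# all device parameters and the mapping rule below are assumptions, not the
# paper's BL weight mapping algorithm.
import numpy as np

rng = np.random.default_rng(0)

ROWS, COLS = 256, 256          # crossbar size from the abstract
W_BITS = 8                     # 8-bit weights (unsigned here for simplicity)
G_ON, G_OFF = 100e-6, 1e-6     # assumed on/off conductances in siemens
SIGMA = 0.05                   # assumed relative conductance variation

def weights_to_bit_planes(w):
    """Slice unsigned 8-bit weights into W_BITS binary planes, one per bit-line group."""
    planes = [(w >> b) & 1 for b in range(W_BITS)]
    return np.stack(planes, axis=-1)              # shape (ROWS, COLS, W_BITS)

def program_conductances(planes):
    """Map each weight bit to an RRAM conductance with multiplicative variation."""
    g_ideal = np.where(planes == 1, G_ON, G_OFF)
    return g_ideal * (1.0 + SIGMA * rng.standard_normal(g_ideal.shape))

def cim_mvm(x, g):
    """Analog-style MVM: each BL current is the sum of row voltage times cell
    conductance; bit-plane partial sums are then recombined with weights 2^b."""
    bl_currents = np.einsum('r,rcb->cb', x, g)          # per-column, per-bit-plane currents
    partial = (bl_currents - x.sum() * G_OFF) / (G_ON - G_OFF)  # remove off-state offset
    return partial @ (2.0 ** np.arange(W_BITS))         # weighted bit-plane recombination

w = rng.integers(0, 256, size=(ROWS, COLS), dtype=np.uint8)
x = rng.random(ROWS)

ideal = x @ w.astype(np.float64)
noisy = cim_mvm(x, program_conductances(weights_to_bit_planes(w)))
print("relative error:", np.linalg.norm(noisy - ideal) / np.linalg.norm(ideal))
```

Running the sketch shows how cell-level conductance variation propagates into the digital result of the crossbar MVM; a weight mapping scheme such as the one proposed in the article would rearrange how weights are assigned to bit lines so that this error has less impact on network accuracy.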
ISSN: 1063-8210
EISSN: 1557-9999
DOI: 10.1109/TVLSI.2022.3140395