Rescuing RRAM-Based Computing From Static and Dynamic Faults
| Published in: | IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 2021-10, Vol. 40 (10), pp. 2049-2062 |
|---|---|
| Main authors: | , , , , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
Abstract: Emerging resistive random access memory (RRAM) has shown great potential for in-memory processing and thus attracts considerable research interest in accelerating memory-intensive applications such as neural networks (NNs). However, the accuracy of RRAM-based NN computing can degrade significantly due to the intrinsic statistical variations of the resistance of RRAM cells. In this article, we propose SIGHT, a synergistic algorithm-architecture fault-tolerant framework, to holistically address this issue. Specifically, we consider three major types of faults for RRAM computing: 1) nonlinear resistance distribution; 2) static variation; and 3) dynamic variation. At the algorithm level, we propose a resistance-aware quantization that compels the NN parameters to follow the same nonlinear resistance distribution as the RRAM cells, and we introduce an input regulation technique to compensate for RRAM variations. We also propose a selective weight refreshing scheme to address the dynamic variation that occurs at runtime. At the architecture level, we propose a general, low-cost architecture to support our fault-tolerant scheme. Our evaluation demonstrates almost no accuracy loss for our three fault-tolerant algorithms, and the proposed SIGHT architecture incurs a performance overhead of as little as 7.14%.
ISSN: 0278-0070, 1937-4151
DOI: 10.1109/TCAD.2020.3037316
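
The resistance-aware quantization mentioned in the abstract maps trained NN weights onto the discrete, nonlinearly spaced resistance levels that an RRAM cell can actually take. The snippet below is a minimal, hypothetical sketch of that idea only, not the paper's implementation: it assumes log-spaced conductance levels as a stand-in for the device-measured nonlinear distribution, and the function name `resistance_aware_quantize` is invented for illustration.

```python
# Hypothetical sketch of resistance-aware quantization (not the SIGHT code):
# snap each NN weight to the nearest level in a nonlinear set of conductance
# values, standing in for an RRAM cell's measured resistance distribution.
import numpy as np

def resistance_aware_quantize(weights: np.ndarray, levels: np.ndarray) -> np.ndarray:
    """Map every weight to the closest available conductance level."""
    # Broadcast to a (..., n_levels) distance tensor and pick the nearest level.
    idx = np.argmin(np.abs(weights[..., None] - levels), axis=-1)
    return levels[idx]

# Example: 8 log-spaced levels as a stand-in for the nonlinear distribution
# (the paper derives the actual level placement from RRAM device behavior).
levels = np.geomspace(0.05, 1.0, num=8)
weights = np.random.uniform(0.0, 1.0, size=(4, 4))
print(resistance_aware_quantize(weights, levels))
```

A practical mapping would also have to handle negative weights (for example with differential cell pairs) and per-layer scaling, which this toy example omits.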