Using Floating-Gate Memory to Train Ideal Accuracy Neural Networks

Bibliographic Details
Published in: IEEE Journal on Exploratory Solid-State Computational Devices and Circuits, 2019-06, Vol. 5 (1), pp. 52-57
Main Authors: Agarwal, Sapan, Garland, Diana, Niroula, John, Jacobs-Gedrim, Robin B., Hsia, Alex, Van Heukelom, Michael S., Fuller, Elliot, Draper, Bruce, Marinella, Matthew J.
Format: Article
Language: English
Online Access: Full text
Description
Summary: Floating-gate silicon-oxide-nitride-oxide-silicon (SONOS) transistors can be used to train neural networks to ideal accuracies that match those of floating-point digital weights on the MNIST handwritten digit data set when using multiple devices to represent a weight, or to within 1% of ideal accuracy when using a single device. This is enabled by operating the devices in the subthreshold regime, where they exhibit symmetric write nonlinearities. A neural training accelerator core based on SONOS with a single device per weight would increase energy efficiency by 120×, operate 2.1× faster, and require 5× less area than an optimized SRAM-based ASIC.
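The mechanism the summary credits, a symmetric write nonlinearity under subthreshold operation, can be illustrated with a short simulation. The Python sketch below is a toy model under assumed parameters: the SubthresholdCell class, its conductance range, and the quadratic taper are illustrative choices, not the authors' device model, and the differential pair only hints at the paper's multi-device-per-weight scheme.

import numpy as np

class SubthresholdCell:
    # Toy model of a SONOS-like analog memory cell (hypothetical parameters).
    # Conductance is bounded in [g_min, g_max]; the write step tapers
    # symmetrically toward both rails, mimicking the symmetric write
    # nonlinearity of subthreshold operation described in the summary.
    def __init__(self, g_min=0.0, g_max=1.0, k=1.0):
        self.g_min, self.g_max, self.k = g_min, g_max, k
        self.g = 0.5 * (g_min + g_max)  # start at the midpoint

    def write(self, delta):
        span = self.g_max - self.g_min
        x = (self.g - self.g_min) / span         # position in [0, 1]
        scale = (4.0 * x * (1.0 - x)) ** self.k  # symmetric taper, peak at midpoint
        self.g = float(np.clip(self.g + delta * scale * span,
                               self.g_min, self.g_max))
        return self.g

# Differential pair: one signed weight encoded as g_plus - g_minus;
# multi-device-per-weight schemes extend this idea to higher precision.
g_plus, g_minus = SubthresholdCell(), SubthresholdCell()
for _ in range(100):
    g_plus.write(+0.05)   # potentiate
    g_minus.write(-0.05)  # depress
print("weight =", g_plus.g - g_minus.g)

Because the taper shrinks potentiation and depression steps by the same factor at any given conductance, repeated +delta/-delta updates do not systematically drift the stored weight, which is the property the summary associates with near-ideal training accuracy.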
ISSN: 2329-9231
DOI: 10.1109/JXCDC.2019.2902409