Exploration of Bitflip's Effect on DNN Accuracy in Plaintext and Ciphertext
Saved in:
Published in: | IEEE MICRO 2023-05, p.1-11 |
---|---|
Main authors: | , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Summary: | Neural Networks (NNs) are increasingly deployed to solve complex classification problems and produce accurate results on reliable systems. However, their accuracy quickly degrades in the presence of bit flips caused by memory errors or targeted attacks on DRAM main memory. Prior work has shown that a few bit errors significantly reduce NN accuracy, but it is unclear which bits have an outsized impact on network accuracy and why. This paper first investigates how the number representation used for NN parameters shapes the impact of bit flips on NN accuracy. We then explore the Bit Flip Detection (BFD) framework: four software-based error detectors that detect bit flips independently of NN topology. We discuss our findings and evaluate the detectors' efficacy, characteristics, and trade-offs. |
---|---|
ISSN: | 0272-1732 |
DOI: | 10.1109/MM.2023.3273115 |
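The abstract's observation that certain bits have an outsized impact follows from the IEEE-754 float32 layout commonly used for NN parameters: a flip in a high exponent bit changes a weight's magnitude by many orders of decimal magnitude, while a low mantissa-bit flip is nearly harmless. A minimal Python sketch (the `flip_bit` helper is illustrative only, not part of the paper's BFD framework) makes this concrete:

```python
import struct

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit (0 = LSB of the mantissa, 31 = sign) in the
    IEEE-754 float32 encoding of `value` and decode the result."""
    (as_int,) = struct.unpack("<I", struct.pack("<f", value))
    (flipped,) = struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))
    return flipped

weight = 0.5
# Flipping the top exponent bit (bit 30) blows the weight up to 2**127,
# while flipping mantissa bit 0 perturbs it by only about 1 part in 2**23.
print(flip_bit(weight, 30))  # 1.7014118346046923e+38 (= 2**127)
print(flip_bit(weight, 0))   # ~0.50000006
```

This asymmetry is why detectors can focus on implausibly large parameter magnitudes: exponent-bit flips, the ones that actually destroy accuracy, tend to produce values far outside the trained weight distribution.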