Robust neural networks using stochastic resonance neurons

Detailed Description

Bibliographic Details
Published in: Communications Engineering 2024-11, Vol. 3 (1), p. 169-7, Article 169
Main Authors: Manuylovich, Egor; Argüello Ron, Diego; Kamalian-Kopae, Morteza; Turitsyn, Sergei K.
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: Various successful applications of deep artificial neural networks are effectively facilitated by the possibility to increase the number of layers and neurons in the network at the expense of the growing computational complexity. Increasing computational complexity to improve performance makes hardware implementation more difficult and directly affects both power consumption and the accumulation of signal processing latency, which are critical issues in many applications. Power consumption can potentially be reduced using analog neural networks, the performance of which, however, is limited by noise aggregation. Following the idea of physics-inspired machine learning, we propose here a type of neural network using stochastic resonances as a dynamic nonlinear node and demonstrate the possibility of considerably reducing the number of neurons required for a given prediction accuracy. We also observe that the performance of such neural networks is more robust against the impact of noise in the training data compared to conventional networks.

Manuylovich and colleagues propose the use of stochastic resonances in neural networks as dynamic nonlinear nodes. They demonstrate the possibility of reducing the number of neurons for a given prediction accuracy and observe that the performance of such neural networks can be more robust against the impact of noise in the training data compared to conventional networks.
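The abstract describes using a stochastic resonance element as the dynamic nonlinear node of a network. As a rough illustration of that general idea only (not the authors' implementation, which is detailed in the full article), the sketch below models such a node as an overdamped bistable system, dx/dt = x - x^3 + u + noise, and uses its time-averaged state as the nonlinear activation of a small layer; all function names, parameter values, and the layer layout are illustrative assumptions.

# Hypothetical sketch of a stochastic-resonance nonlinear node (illustrative only).
import numpy as np

def sr_neuron(u, noise_std=0.3, dt=0.01, steps=200, rng=None):
    """Time-averaged state of a bistable double-well node driven by input u plus noise."""
    rng = np.random.default_rng() if rng is None else rng
    x = 0.0
    total = 0.0
    for _ in range(steps):
        # Overdamped double-well dynamics with additive Gaussian noise.
        x += (x - x**3 + u) * dt + noise_std * np.sqrt(dt) * rng.standard_normal()
        total += x
    return total / steps

def sr_layer(inputs, weights, bias, **kwargs):
    """Apply the stochastic-resonance nonlinearity to a weighted sum of inputs."""
    pre = weights @ inputs + bias
    return np.array([sr_neuron(p, **kwargs) for p in pre])

# Toy usage: a single hidden layer of three SR nodes acting on a 2-D input.
rng = np.random.default_rng(0)
x_in = np.array([0.4, -0.2])
W = rng.standard_normal((3, 2))
b = np.zeros(3)
print(sr_layer(x_in, W, b, rng=rng))

In such a node the noise level is a tuning parameter rather than a pure nuisance: near the stochastic resonance optimum, the noise helps the input switch the bistable state, which is the effect the article exploits.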
ISSN: 2731-3395
DOI: 10.1038/s44172-024-00314-0