Efficient Multiplier-less Perceptron Architecture for Realization of Multilayer Perceptron Inference Models


Bibliographic Details
Published in: Circuits, Systems, and Signal Processing, 2023-08, Vol. 42 (8), p. 4637-4668
Main Authors: Tripathi, Raghuvendra Pratap; Tiwari, Manish; Dhawan, Amit; Jha, Sumit Kumar; Singh, Arun Kumar
Format: Article
Language: English
Online Access: Full text
Abstract

Artificial neural networks (ANNs) have gained considerable interest in industrial and academic research due to their vast range of application areas; consequently, many real-time applications using ANNs have been developed. In this paper, we propose an efficient multiplier-less realization architecture using distributed arithmetic (DA) for the perceptrons used to realize multilayer perceptron (MLP) inference models. An efficient organization of the activation function unit using a domain-specific approach is proposed to save a significant number of resources. Further, a generalized ROM-based efficient realization architecture using DA is proposed for the calculation of the weighted sum. The architecture is further optimized for area by exploiting symmetry properties. Several previously reported MLP reference models are used for comparison. Finally, with the help of application-specific integrated circuit synthesis results, we demonstrate that the proposed method achieves significantly better performance parameters.
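To illustrate the core idea behind a DA-based weighted sum, the following is a minimal software sketch: the multiply-accumulate over the inputs is replaced by bit-serial lookups into a precomputed table of weight-subset sums, plus shifts and additions. This is a generic illustration of distributed arithmetic for unsigned fixed-point inputs, not the paper's specific ROM-optimized, symmetry-exploiting architecture; all names here are illustrative.

```python
def build_lut(weights):
    """Precompute the sum of every subset of weights.

    Entry `pattern` holds the sum of weights[i] for every i whose
    bit is set in `pattern` -- this table plays the role of the ROM
    in a hardware DA realization (size 2**len(weights)).
    """
    n = len(weights)
    return [sum(w for i, w in enumerate(weights) if (pattern >> i) & 1)
            for pattern in range(1 << n)]


def da_weighted_sum(weights, inputs, bits):
    """Multiplier-less inner product via distributed arithmetic.

    Processes the unsigned `bits`-bit inputs bit-serially: at each bit
    position b, the b-th bits of all inputs form a LUT address, and the
    LUT output is accumulated with a left shift -- no multiplier needed.
    """
    lut = build_lut(weights)
    acc = 0
    for b in range(bits):
        # Address = bit b of every input, packed into one word.
        addr = sum(((x >> b) & 1) << i for i, x in enumerate(inputs))
        acc += lut[addr] << b  # LUT read + shift replaces a multiply
    return acc


# Sanity check against the direct multiply-accumulate form.
w, x = [3, -5, 2, 7], [9, 4, 13, 6]
assert da_weighted_sum(w, x, bits=4) == sum(wi * xi for wi, xi in zip(w, x))
```

In hardware, the LUT becomes a ROM and the loop a shift-accumulate register clocked once per input bit, which is what makes the perceptron's weighted sum multiplier-less; the exponential ROM growth with the number of inputs is what area optimizations such as the symmetry exploitation mentioned in the abstract target.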
ISSN: 0278-081X
eISSN: 1531-5878
DOI: 10.1007/s00034-023-02318-1