Topology Variations of an Amplifier-based MOS Analog Neural Network Implementation and Weights Optimization
Published in: Analog Integrated Circuits and Signal Processing, 2021-03, Vol. 106 (3), pp. 635-647
Format: Article
Language: English
Abstract: Neural networks are achieving state-of-the-art performance in many applications, from speech recognition to computer vision. A neuron in a multi-layer network must multiply each input by its weight, sum the results, and apply an activation function. This paper is an extended version of our conference article, in which we present an amplifier-based MOS analog neuron and the optimization of its synaptic weights using in-loop circuit simulations. In addition to the base topology, we present two variations of the original conference-paper topology that reduce area and power. MOS transistors operating in the triode region are used as variable resistors that convert the input and weight voltages into a proportional input current. To test the analog neuron in full networks, an automatic generator produces a netlist from the number of neurons in each layer, the inputs, and the weights. Simulation results in a 180 nm CMOS technology demonstrate, for all topologies, the neuron's correct transfer function and its functionality when trained on test datasets.
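As a brief aside on the triode-region multiplication mentioned in the abstract, the following is a sketch based on the standard long-channel MOS model, not on equations reproduced from the paper: for a small drain-source voltage the device acts as a voltage-controlled resistor, so its current is roughly the product of a weight-controlled conductance and the input voltage.

$$
I_D = \mu_n C_{ox}\frac{W}{L}\left[\left(V_{GS}-V_{th}\right)V_{DS}-\frac{V_{DS}^{2}}{2}\right]
\;\approx\; \mu_n C_{ox}\frac{W}{L}\left(V_{GS}-V_{th}\right)V_{DS},
\qquad V_{DS}\ll V_{GS}-V_{th}.
$$

With the weight voltage on the gate and the input voltage across drain and source (the exact assignment in each topology is a detail of the paper), the drain current approximates the input-weight product, and summing the currents of several such synapses at the amplifier input realizes the neuron's weighted sum.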
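To make the "automatic generator" idea concrete, here is a minimal behavioral sketch in Python of how such a netlist generator might look. The `SYNAPSE` and `NEURON` subcircuits, the node naming, and the weight encoding are illustrative assumptions; the paper's actual generator and subcircuit interfaces are not reproduced here.

```python
def generate_netlist(layer_sizes, weights, vdd=1.8):
    """Emit a SPICE-style netlist for a fully connected analog network.

    layer_sizes : e.g. [3, 4, 2] -> 3 inputs, a hidden layer of 4, 2 outputs.
    weights[l][j][i] : weight voltage of the synapse from node i of layer l
                       to neuron j of layer l+1 (illustrative encoding).
    """
    lines = [f".param VDD={vdd}"]
    for l, (n_in, n_out) in enumerate(zip(layer_sizes, layer_sizes[1:])):
        for j in range(n_out):
            for i in range(n_in):
                w = weights[l][j][i]
                # One triode-region MOS "resistor" per synapse; its gate
                # voltage encodes the weight, its drain current feeds the
                # summing node of the destination neuron.
                lines.append(f"XS_{l}_{j}_{i} in_{l}_{i} sum_{l+1}_{j} "
                             f"w_{l}_{j}_{i} SYNAPSE")
                lines.append(f"VW_{l}_{j}_{i} w_{l}_{j}_{i} 0 DC {w}")
            # Amplifier-based neuron: sums the synapse currents, applies the
            # activation, and drives the next layer's input node.
            lines.append(f"XN_{l+1}_{j} sum_{l+1}_{j} in_{l+1}_{j} NEURON")
    return "\n".join(lines)


if __name__ == "__main__":
    # Toy 2-3-1 network with arbitrary weight voltages (volts).
    sizes = [2, 3, 1]
    w = [[[0.4, 0.7], [0.2, 0.9], [0.5, 0.1]],  # layer 0 -> 1
         [[0.6, 0.3, 0.8]]]                     # layer 1 -> 2
    print(generate_netlist(sizes, w))
```

A generator of this kind only has to be rerun with a new `weights` structure for each candidate during the in-loop weight optimization, which is what makes simulation-driven training of the full analog network practical.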
ISSN: 0925-1030, 1573-1979
DOI: 10.1007/s10470-021-01798-y