Ultra-Low-Energy Mixed-Signal IC Implementing Encoded Neural Networks
| | |
|---|---|
| Published in: | IEEE Transactions on Circuits and Systems I: Regular Papers, 2016-11, Vol. 63 (11), p. 1974-1985 |
| Main authors: | , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
| Abstract: | Encoded Neural Networks (ENNs) combine a low-complexity algorithm with a storage capacity much larger than that of Hopfield Neural Networks (HNNs) for the same number of nodes. Moreover, they have a lower connection density than HNNs, allowing low-complexity circuit integration. The implementation of such a network requires low-complexity elements to take full advantage of the model's assets. This paper proposes an analog implementation of ENNs. It is shown that this type of implementation is suitable for building networks of thousands of nodes. To validate the proposed implementation, a prototype ENN of 30 computation nodes is designed, fabricated, and tested on-chip in the ST 65-nm, 1-V supply complementary metal-oxide-semiconductor (CMOS) process. The circuit shows decoding performance similar to that of the theoretical model and decodes a message in 58 ns. Moreover, the entire network occupies a silicon area of 16,470 μm² and consumes 145 μW, yielding a measured energy consumption of 68 fJ per synaptic event per computation node. |
| ISSN: | 1549-8328, 1558-0806 |
| DOI: | 10.1109/TCSI.2016.2600663 |
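The abstract describes the chip's decoding behaviour and energy figures but not the underlying retrieval rule. Below is a minimal software sketch, assuming the clique-based (Gripon-Berrou) formulation commonly associated with ENNs, with "sum-of-sum" scoring followed by per-cluster winner-take-all. The 4×8 network size, function names, and iteration count are illustrative assumptions, not the analog circuit described in the paper.

```python
# Minimal sketch of an Encoded (clique-based) Neural Network: messages are
# stored as cliques over one active neuron ("fanal") per cluster, and partial
# messages are completed by iterative scoring plus per-cluster winner-take-all.
import numpy as np

C, L = 4, 8                                   # clusters x fanals per cluster (illustrative)
W = np.zeros((C * L, C * L), dtype=int)       # binary synaptic matrix

def idx(cluster, symbol):
    """Flat index of the fanal representing `symbol` in `cluster`."""
    return cluster * L + symbol

def store(message):
    """Store a message (one symbol per cluster) as a fully connected clique."""
    nodes = [idx(c, s) for c, s in enumerate(message)]
    for a in nodes:
        for b in nodes:
            if a != b:
                W[a, b] = 1

def retrieve(partial, iterations=4):
    """Recover a stored message from a partial one (None marks an erased symbol)."""
    active = np.zeros(C * L, dtype=int)
    for c, s in enumerate(partial):
        if s is None:
            active[c * L:(c + 1) * L] = 1     # all fanals of an erased cluster start active
        else:
            active[idx(c, s)] = 1
    for _ in range(iterations):
        scores = W @ active                   # each fanal counts its active connections
        new_active = np.zeros_like(active)
        for c in range(C):
            sl = slice(c * L, (c + 1) * L)
            new_active[sl] = (scores[sl] == scores[sl].max())   # winner-take-all per cluster
        active = new_active
    return [int(np.argmax(active[c * L:(c + 1) * L])) for c in range(C)]

store([1, 5, 2, 7])
print(retrieve([1, None, 2, 7]))              # -> [1, 5, 2, 7] when the clique is unambiguous
```

As a rough consistency note on the reported figures: if the 145 μW consumption applies during the 58 ns decode, that corresponds to roughly 8.4 pJ per decoded message, while the 68 fJ value is the measured per-synaptic-event, per-computation-node breakdown quoted in the abstract.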