Associative content-addressable networks with exponentially many robust stable states
Format: Article
Language: English
Abstract: The brain must robustly store a large number of memories, corresponding to the many events encountered over a lifetime. However, the number of memory states in existing neural network models either grows weakly with network size or recall fails catastrophically with vanishingly little noise. We construct an associative content-addressable memory with exponentially many stable states and robust error-correction. The network possesses expander graph connectivity on a restricted Boltzmann machine architecture. The expansion property allows simple neural network dynamics to perform at par with modern error-correcting codes. Appropriate networks can be constructed with sparse random connections, glomerular nodes, and associative learning using low dynamic-range weights. Thus, sparse quasi-random structures---characteristic of important error-correcting codes---may provide for high-performance computation in artificial neural networks and the brain.
DOI: 10.48550/arxiv.1704.02019
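The abstract describes a memory network with sparse random bipartite (expander-like) connectivity between memory neurons and constraint ("glomerular") nodes, where simple local dynamics perform the error correction. The sketch below is a minimal illustration under assumed simplifications, not the paper's construction: it treats each hidden node as a parity-check-style constraint and uses a majority bit-flip update in the spirit of expander-code decoding. The sizes `N`, `M`, `d` and the helper `flip_decode` are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: N visible (memory) neurons, M hidden constraint nodes,
# each visible unit wired to d randomly chosen hidden nodes.
N, M, d = 120, 60, 5

# Sparse random bipartite connectivity (expander-like for suitable N, M, d).
H = np.zeros((M, N), dtype=int)
for j in range(N):
    H[rng.choice(M, size=d, replace=False), j] = 1


def flip_decode(H, x, max_iters=200):
    """Bit-flip dynamics: flip a visible unit whenever it participates in a
    majority of violated constraints; stop when all constraints are satisfied."""
    x = x.copy()
    for _ in range(max_iters):
        viol = (H @ x) % 2                 # which parity constraints are violated
        if not viol.any():
            break                          # all satisfied: a stable (memory) state
        counts = H.T @ viol                # violated constraints touching each unit
        bad = np.flatnonzero(counts > H.sum(axis=0) / 2)
        if bad.size == 0:
            break                          # no unit violates a majority: stuck
        x[bad[0]] ^= 1                     # flip one offending unit per step
    return x


# Any configuration satisfying all constraints is a stable state; the all-zeros
# pattern trivially does, so use it as a stand-in stored memory.
memory = np.zeros(N, dtype=int)

# Corrupt a few bits and let the local dynamics clean them up.
noisy = memory.copy()
noisy[rng.choice(N, size=3, replace=False)] ^= 1
recovered = flip_decode(H, noisy)
print("bit errors before:", int((noisy != memory).sum()),
      "after:", int((recovered != memory).sum()))
```

In this toy setting each corrupted unit typically violates most of its d constraints while correct units violate few, so the majority rule flips the corrupted units back; per the abstract, it is the expansion property of such sparse bipartite graphs that lets simple dynamics of this kind match the performance of modern error-correcting codes.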