A neural network associative memory for handwritten character recognition using multiple Chua characters

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, 1993-10, Vol. 40 (10), p. 667-674
Authors: Baird, B., Hirsch, M.W., Eeckman, F.
Format: Article
Language: English
Abstract: A neural network architecture and learning algorithm for associative memory storage of analog patterns, continuous sequences, and chaotic attractors in the same network is described. System performance is investigated in an application to real-time handwritten digit recognition, using many different chaotic attractors from the family of Chua attractors implemented by the Chua hardware circuit. Several of these attractors outperform the previously studied Lorenz attractor system in both accuracy and speed of convergence. In the normal form projection algorithm, developed at Berkeley for associative memory storage of dynamic attractors, a matrix inversion determines the network weights, given the prototype patterns to be stored. There are N units of capacity in an N-node network with 3N² weights. Storage costs one unit per static attractor, two per Fourier component of each periodic trajectory, and at least three per chaotic attractor. There are no spurious attractors, and for periodic attractors there is a Lyapunov function, in a special coordinate system, which governs the approach of transient states to stored trajectories. Unsupervised or supervised incremental learning algorithms for pattern classification, such as competitive learning or bootstrap Widrow-Hoff, can easily be implemented. The architecture can be "folded" into a recurrent network with higher-order weights that can be used as a model of cortex that stores oscillatory and chaotic attractors by a Hebb rule. A novel computing architecture has been constructed of recurrently interconnected associative memory modules of this type. Architectural variations employ selective synchronization of modules with chaotic attractors that communicate by broad-spectrum chaotic signals to control the flow of computation.
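
The pivotal step named in the abstract, "a matrix inversion determines the network weights, given the prototype patterns to be stored," can be sketched in a few lines. The sketch below is not the authors' full normal form projection algorithm: it omits the higher-order competition weights that account for the 3N² total, as well as the attractor dynamics themselves. It only illustrates the linear projection step, and the patterns, growth rates, and variable names are invented for illustration. In Python/NumPy:

    import numpy as np

    # Minimal sketch of the projection step (illustration only, not the
    # authors' code). Assumption: N linearly independent prototype
    # patterns, stored as the columns of P, are pinned as eigenmodes of
    # the linear weight matrix W.
    rng = np.random.default_rng(0)
    N = 8

    # Hypothetical prototype patterns; in the paper these would be
    # feature vectors derived from handwritten digits.
    P = rng.standard_normal((N, N))

    # Desired normal-form growth rates, one per stored mode, specified
    # as a diagonal matrix in prototype coordinates.
    J = np.diag(rng.uniform(0.5, 1.0, size=N))

    # "A matrix inversion determines network weights": map the
    # normal-form dynamics back into network coordinates.
    W = P @ J @ np.linalg.inv(P)

    # Check: each prototype is an eigenvector of W with its chosen rate.
    for k in range(N):
        assert np.allclose(W @ P[:, k], J[k, k] * P[:, k])
    print("all", N, "prototypes stored as eigenmodes of W")

Here each stored prototype becomes an eigenmode of the weight matrix; in the full algorithm, cubic normal-form terms are projected through the same inverse so that the stored modes compete and the prototypes become attractors rather than mere eigenvectors.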
ISSN: 1057-7130
eISSN: 1558-125X
DOI: 10.1109/82.246169