Designing asymmetric Hopfield-type associative memory with higher order Hamming stability

Bibliographic Details
Published in: IEEE Transactions on Neural Networks, 2005-11, Vol. 16 (6), p. 1464-1476
Main authors: Donq-Liang Lee, T. C. Chuang
Format: Article
Language: English
Description
Abstract: The problem of optimal asymmetric Hopfield-type associative memory (HAM) design based on perceptron-type learning algorithms is considered. Most existing methods treat the design problem either as 1) finding optimal hyperplanes according to the normal distance from the prototype vectors to the hyperplane surface, or as 2) obtaining the weight matrix W = [w_ij] by solving a constrained optimization problem. In this paper, we show that since the state space of the HAM consists only of bipolar patterns, i.e., V = (v_1, v_2, ..., v_N)^T ∈ {-1, +1}^N, the basins of attraction around each prototype (training) vector should be expanded using the Hamming distance measure. For this reason, the design problem is considered here from a different point of view: the size of the training set is systematically increased according to the desired basin of attraction around each prototype vector. We name this concept higher order Hamming stability and show that the conventional minimum-overlap algorithm can be modified to incorporate it. Experimental results show that the proposed method improves the recall capability and reduces the number of spurious memories. Moreover, it is well known that setting all self-connections w_ii = 0 for all i reduces the number of spurious memories in the state space. From the experimental results, we find that the basin width around each prototype vector can be enlarged by allowing nonzero diagonal elements while learning the weight matrix W. If the magnitude of w_ii is small for all i, then the condition w_ii = 0 for all i can be relaxed without seriously affecting the number of spurious memories in the state space. Therefore, the proposed method can be used to increase the basin width around each prototype vector at the cost of a slight increase in the number of spurious memories.
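
The construction described in the abstract, expanding the training set to every bipolar vector within a chosen Hamming radius of each prototype and then learning each row of an asymmetric weight matrix W with a perceptron-type rule, can be illustrated with a short sketch. The Python below is a minimal sketch under stated assumptions, not the authors' implementation: the paper modifies the minimum-overlap algorithm, whereas this sketch uses a plain perceptron update, and all names (hamming_ball, train_ham, recall) and parameters (radius, eta, epochs) are hypothetical. The zero_diagonal flag mirrors the paper's discussion of enforcing or relaxing w_ii = 0.

# Minimal sketch (not the authors' code) of higher order Hamming stability:
# enlarge the training set with every bipolar vector within a chosen Hamming
# radius of each prototype, then learn each row of an asymmetric W with a
# perceptron-type rule (the paper uses a modified minimum-overlap algorithm).
import itertools
import numpy as np

def hamming_ball(prototype, radius):
    """Return all bipolar vectors within `radius` bit flips of `prototype`."""
    n = len(prototype)
    vectors = []
    for r in range(radius + 1):
        for idx in itertools.combinations(range(n), r):
            v = prototype.copy()
            v[list(idx)] *= -1          # flip the chosen bits
            vectors.append(v)
    return vectors

def train_ham(prototypes, radius=1, eta=0.1, epochs=200, zero_diagonal=True):
    """Perceptron-type learning over the Hamming-expanded training set.

    zero_diagonal=True enforces w_ii = 0; False allows the nonzero
    self-connections whose effect the paper investigates.
    """
    n = prototypes.shape[1]
    W = np.zeros((n, n))                # asymmetric: no symmetry constraint
    pairs = [(v, p) for p in prototypes for v in hamming_ball(p, radius)]
    for _ in range(epochs):
        stable = True
        for v, target in pairs:
            out = np.sign(W @ v)
            err = target - out          # nonzero where a row misclassifies
            if np.any(err):
                stable = False
                W += eta * np.outer(err, v)   # row-wise perceptron update
                if zero_diagonal:
                    np.fill_diagonal(W, 0.0)
        if stable:                      # every expanded pattern maps to its prototype
            break
    return W

def recall(W, probe, steps=50):
    """Synchronous recall until a fixed point (or the step limit) is reached."""
    v = probe.astype(float)
    for _ in range(steps):
        nxt = np.sign(W @ v)
        nxt[nxt == 0] = 1.0             # resolve ties to +1
        if np.array_equal(nxt, v):
            break
        v = nxt
    return v

# Example: two 8-bit prototypes; a probe one bit away should recall its
# prototype if training converged on the radius-1 expanded set.
P = np.array([[ 1, -1,  1,  1, -1, -1,  1, -1],
              [-1,  1, -1,  1,  1, -1, -1,  1]])
W = train_ham(P, radius=1)
probe = P[0].copy()
probe[3] *= -1                          # corrupt one bit
print(np.array_equal(recall(W, probe), P[0]))

For an N-bit memory, a radius-1 expansion contributes N + 1 training vectors per prototype (the prototype plus its N one-bit neighbors), which is the mechanism by which the Hamming-distance basin of attraction is widened; a radius-k expansion grows as the sum of C(N, r) for r = 0, ..., k, so the training set size increases combinatorially with the desired basin width.
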
ISSN: 1045-9227
ISSN (online): 1941-0093
DOI: 10.1109/TNN.2005.852863