Correlations in high dimensional or asymmetric data sets: Hebbian neuronal processing

Bibliographic Details
Published in: Neural Networks, 1991, Vol. 4 (3), pp. 337-347
Authors: Softky, William R.; Kammen, Daniel M.
Format: Article
Language: English
Description
Abstract: The Hebbian neural learning algorithm that implements Principal Component Analysis (PCA) can be extended for the analysis of more realistic forms of neural data by including higher than two-channel correlations and non-Euclidean l^p metrics. Maximizing a dth rank tensor form which correlates d channels is equivalent to raising the exponential order of variance correlation from 2 to d in the algorithm that implements PCA. Simulations suggest that a generalized version of Oja's PCA neuron can detect such a dth order principal component. Arguments from biology and pattern recognition suggest that neural data in general are not symmetric about their mean; performing PCA with an implicit l^1 metric rather than the Euclidean metric weights exponentially distributed vectors according to their probability, as does a highly nonlinear Hebb rule. The correlation order d and the l^p metric exponent p were each roughly constant for each of several Hebb rules simulated. High-order correlation analysis may prove increasingly useful as data from large networks of cells engaged in information processing become available.
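The abstract names Oja's PCA neuron and a dth-order generalization without giving the update rules. The sketch below is a minimal illustration, not the paper's exact simulation: oja_pca implements Oja's well-known single-neuron rule, and higher_order_hebb is a hypothetical variant that performs stochastic gradient ascent on E[(w·x)^d] with explicit renormalization to the unit sphere, which is one standard way to realize "raising the exponential order of variance correlation from 2 to d". Function names and parameters are illustrative assumptions.

```python
import numpy as np

def oja_pca(X, eta=0.01, epochs=50, seed=None):
    """Oja's single-neuron PCA rule: w <- w + eta * y * (x - y * w),
    with y = w . x.  Converges toward the first principal component."""
    rng = np.random.default_rng(seed)
    n, dim = X.shape
    w = rng.normal(size=dim)
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X[rng.permutation(n)]:
            y = w @ x
            w += eta * y * (x - y * w)
    return w

def higher_order_hebb(X, d=4, eta=0.01, epochs=50, seed=None):
    """Hypothetical dth-order generalization (an assumption, not the
    paper's code): gradient ascent on E[(w . x)**d] constrained to
    |w| = 1, so the neuron picks up d-channel correlations rather
    than pairwise covariance; d = 2 corresponds to ordinary PCA."""
    rng = np.random.default_rng(seed)
    n, dim = X.shape
    w = rng.normal(size=dim)
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X[rng.permutation(n)]:
            y = w @ x
            w += eta * d * y ** (d - 1) * x   # gradient of y**d w.r.t. w
            w /= np.linalg.norm(w)            # project back to the unit sphere
    return w
```

Because the update uses y**(d-1), the rule weights large-amplitude inputs far more heavily than Oja's quadratic rule does, which is the sense in which a highly nonlinear Hebb rule can favor asymmetric, exponentially distributed data.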
ISSN: 0893-6080 (print); 1879-2782 (online)
DOI: 10.1016/0893-6080(91)90070-L