Optimal storage capacity of neural networks at finite temperatures

Bibliographic details
Main authors: Shim, G. M., Kim, D., Choi, M. Y.
Format: Article
Language: English
Description
Abstract: Gardner's analysis of the optimal storage capacity of neural networks is extended to study finite-temperature effects. The typical volume of the space of interactions is calculated for strongly-diluted networks as a function of the storage ratio $\alpha$, temperature $T$, and the tolerance parameter $m$, from which the optimal storage capacity $\alpha_c$ is obtained as a function of $T$ and $m$. At zero temperature it is found that $\alpha_c = 2$ regardless of $m$, while at finite temperatures $\alpha_c$ in general increases with the tolerance. We show how the best performance for given $\alpha$ and $T$ is obtained, which reveals a first-order transition from high-quality performance to a low-quality one at low temperatures. An approximate criterion for recall, valid near $m = 1$, is also discussed.
DOI: 10.48550/arxiv.cond-mat/9306032
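
As a point of reference for the zero-temperature result quoted above, Gardner's original $T = 0$ calculation for a perceptron storing random patterns with stability parameter $\kappa$ (a quantity from her work, distinct from the tolerance $m$ used in this paper) gives

$$
\alpha_c(\kappa) = \left[ \int_{-\kappa}^{\infty} \frac{dt}{\sqrt{2\pi}}\, e^{-t^2/2}\, (t+\kappa)^2 \right]^{-1},
\qquad
\alpha_c(0) = \left[ \int_{0}^{\infty} \frac{dt}{\sqrt{2\pi}}\, e^{-t^2/2}\, t^2 \right]^{-1} = 2,
$$

consistent with the statement that $\alpha_c = 2$ at $T = 0$ regardless of $m$. The finite-temperature, finite-tolerance expression is derived in the paper itself and is not reproduced here.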