Optimal storage capacity of neural networks at finite temperatures


Bibliographic Details
Published in: arXiv.org 1993-06
Main authors: Shim, G M, Kim, D, Choi, M Y
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: Gardner's analysis of the optimal storage capacity of neural networks is extended to study finite-temperature effects. The typical volume of the space of interactions is calculated for strongly diluted networks as a function of the storage ratio \(\alpha\), temperature \(T\), and the tolerance parameter \(m\), from which the optimal storage capacity \(\alpha_c\) is obtained as a function of \(T\) and \(m\). At zero temperature it is found that \(\alpha_c = 2\) regardless of \(m\), while at finite temperatures \(\alpha_c\) in general increases with the tolerance. We show how the best performance for given \(\alpha\) and \(T\) is obtained, which reveals a first-order transition from high-quality to low-quality performance at low temperatures. An approximate criterion for recall, valid near \(m = 1\), is also discussed.
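
For context, the zero-temperature value quoted above (\(\alpha_c = 2\), independent of \(m\)) coincides with Gardner's classic replica-symmetric capacity at vanishing stability threshold. The following standard formula is a reminder of that baseline only; it is not taken from the paper's text, and the threshold parameter \(\kappa\) is an assumed stand-in rather than the paper's own tolerance notation:

\[
  \alpha_c(\kappa)^{-1} = \int_{-\kappa}^{\infty} \frac{dt}{\sqrt{2\pi}}\, e^{-t^2/2}\,(t+\kappa)^2 ,
  \qquad
  \alpha_c(0) = \left[\int_{0}^{\infty} \frac{dt}{\sqrt{2\pi}}\, e^{-t^2/2}\, t^2\right]^{-1} = 2 .
\]

At \(\kappa = 0\) the Gaussian integral equals \(1/2\), giving \(\alpha_c = 2\), consistent with the abstract's zero-temperature result.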
ISSN: 2331-8422