Incremental learning with a homeostatic self-organizing neural model

Full description

Bibliographic Details
Published in: Neural Computing & Applications, 2020-12, Vol. 32 (24), pp. 18101-18121
Author: Gepperth, Alexander
Format: Article
Language: English
Online access: Full text
Description
Summary: We present a new self-organized neural model, termed resilient self-organizing tissue (ReST), which can be run as a convolutional neural network and possesses a C∞ energy function as well as a probabilistic interpretation of neural activities. The latter arises from the constraint of a lognormal activity distribution over time that is enforced during ReST learning. The principal message of this article is that self-organized models in general, owing to their localized learning rule that updates only those units close to the best-matching unit, are ideal representation learners for incremental learning architectures. We present such an architecture that uses ReST layers as a building block, benchmark its incremental-learning performance on three real-world visual classification problems, and justify the mechanisms implemented in the architecture through dedicated experiments.
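The localized learning rule the abstract refers to is the defining property of self-organizing models: only units in a neighborhood of the best-matching unit (BMU) are updated, so new data perturbs most of the learned representation very little. The sketch below illustrates this generic SOM-style update; it is not the ReST model itself (which additionally enforces a lognormal activity distribution), and the function name, grid shape, and hyperparameters are illustrative assumptions.

```python
import numpy as np

def som_update(weights, x, lr=0.1, sigma=1.0):
    """One localized update step for a 2-D grid of prototype vectors.

    weights: (H, W, D) array of per-unit prototypes
    x:       (D,) input sample
    Only units near the best-matching unit (BMU) receive a
    significant update, via a Gaussian neighborhood kernel.
    This is a generic SOM sketch, not the paper's ReST rule.
    """
    H, W, _ = weights.shape
    # find the BMU: the unit whose prototype is closest to x
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), (H, W))
    # Gaussian neighborhood centered on the BMU (in grid coordinates)
    rows, cols = np.indices((H, W))
    grid_dist2 = (rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2
    h = np.exp(-grid_dist2 / (2.0 * sigma ** 2))
    # localized rule: pull nearby prototypes toward x, distant ones barely move
    weights += lr * h[..., None] * (x - weights)
    return weights, bmu
```

Because the kernel `h` decays quickly with grid distance, prototypes far from the BMU are nearly untouched, which is what makes such models attractive for incremental learning: training on a new class mostly rewrites one region of the map.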
ISSN: 0941-0643 (print), 1433-3058 (electronic)
DOI:10.1007/s00521-019-04112-0