Neuromimetic metaplasticity for adaptive continual learning
Saved in:
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Conventional intelligent systems based on deep neural network (DNN) models
encounter challenges in achieving human-like continual learning due to
catastrophic forgetting. Here, we propose a metaplasticity model inspired by
human working memory, enabling DNNs to perform catastrophic forgetting-free
continual learning without any pre- or post-processing. A key aspect of our
approach involves implementing distinct types of synapses, ranging from stable to
flexible, and randomly intermixing them to train synaptic connections with
different degrees of flexibility. This strategy allowed the network to
successfully learn a continuous stream of information, even under unexpected
changes in input length. The model achieved a balanced tradeoff between memory
capacity and performance without requiring additional training or structural
modifications, dynamically allocating memory resources to retain both old and
new information. Furthermore, the model demonstrated robustness against data
poisoning attacks by selectively filtering out erroneous memories, leveraging
the Hebb repetition effect to reinforce the retention of significant data.
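The core mechanism described in the abstract — synapses with distinct, randomly intermixed degrees of flexibility — can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a single weight matrix where each synapse is assigned a fixed per-synapse "flexibility" factor (stable near 0, flexible near 1) that scales its gradient update:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: one weight matrix whose synapses each receive a fixed
# flexibility factor drawn uniformly from [0, 1], randomly intermixed, so
# some connections stay stable while others remain highly plastic.
n_in, n_out = 8, 4
W = rng.normal(scale=0.1, size=(n_in, n_out))
flexibility = rng.uniform(0.0, 1.0, size=W.shape)  # per-synapse plasticity


def metaplastic_update(W, grad, flexibility, lr=0.1):
    """Scale each synapse's update by its own flexibility factor:
    stable synapses (factor near 0) barely move, flexible ones move freely."""
    return W - lr * flexibility * grad


# One illustrative update with a uniform gradient: the resulting weight
# change per synapse is proportional to that synapse's flexibility.
grad = np.ones_like(W)
W_new = metaplastic_update(W, grad, flexibility)
delta = np.abs(W_new - W)
```

Under this sketch, protection of old memories comes from the stable synapses resisting overwriting, while flexible synapses absorb new information; the paper's actual update rule may differ.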
DOI: 10.48550/arxiv.2407.07133