Convolutional Neural Network With Developmental Memory for Continual Learning

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2021-06, Vol. 32 (6), pp. 2691-2705
Main Authors: Park, Gyeong-Moon; Yoo, Sahng-Min; Kim, Jong-Hwan
Format: Article
Language: English
Description
Abstract: Convolutional neural networks (CNNs) are one of the most successful deep neural networks. Indeed, most of the recent applications related to computer vision are based on CNNs. However, when learning new tasks in a sequential manner, CNNs face catastrophic forgetting: they forget a considerable amount of previously learned tasks while adapting to novel tasks. To overcome this main barrier to continual learning with CNNs, we introduce developmental memory (DM) into a CNN, continually generating submemory networks to learn important features of individual tasks. A novel training method, referred to here as guided learning (GL), guides the newly generated submemory to become an expert on the new task, eventually improving the performance of the overall network. At the same time, the existing submemories attempt to preserve the knowledge of old tasks. Experiments on image classification tasks show that compared with the state-of-the-art algorithms, the proposed CNN with DM not only improves the classification performance on the new image task but also leads to less forgetting of previous image tasks to facilitate continual learning.
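
Based only on the abstract above, the following minimal PyTorch sketch illustrates one plausible way such a developmental-memory CNN could be organized: a shared backbone, a per-task "submemory" branch that is added (and the older branches frozen) whenever a new task arrives, and a per-task classifier head. All names (SubMemory, DMCNN, add_task), the layer sizes, and the feature-summation fusion are illustrative assumptions, not the authors' implementation; the paper's guided learning (GL) procedure is not reproduced here.

# Hypothetical sketch inferred from the abstract; not the paper's code.
import torch
import torch.nn as nn


class SubMemory(nn.Module):
    """Small branch that learns task-specific features from backbone output."""

    def __init__(self, in_channels: int, feat_dim: int):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Conv2d(in_channels, in_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(in_channels, feat_dim),
        )

    def forward(self, x):
        return self.branch(x)


class DMCNN(nn.Module):
    """Shared CNN backbone with a growing set of per-task submemories."""

    def __init__(self, feat_dim: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
        )
        self.feat_dim = feat_dim
        self.submemories = nn.ModuleList()
        self.heads = nn.ModuleList()  # one classifier head per task

    def add_task(self, num_classes: int):
        """Freeze existing submemories and create a new one for the incoming task."""
        for sub in self.submemories:
            for p in sub.parameters():
                p.requires_grad = False
        self.submemories.append(SubMemory(64, self.feat_dim))
        self.heads.append(nn.Linear(self.feat_dim, num_classes))

    def forward(self, x, task_id: int):
        shared = self.backbone(x)
        # Frozen submemories still contribute features (preserved knowledge);
        # here they are simply summed with the newest submemory's features.
        feats = sum(sub(shared) for sub in self.submemories[: task_id + 1])
        return self.heads[task_id](feats)


if __name__ == "__main__":
    model = DMCNN()
    model.add_task(num_classes=10)      # task 0
    model.add_task(num_classes=5)       # task 1: task-0 submemory is now frozen
    logits = model(torch.randn(2, 3, 32, 32), task_id=1)
    print(logits.shape)                 # torch.Size([2, 5])

In this toy setup, freezing the older submemories is what preserves old-task features, while only the newly added branch (and its head) adapts to the current task, mirroring the division of roles the abstract describes.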
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2020.3007548