Incremental Learning Based on Angle Constraints

Bibliographic Details
Published in: Journal of Physics: Conference Series, 2021-04, Vol. 1880 (1), art. 012030
Authors: Zhou, Yang; Qin, Yunbai; Jiang, F.; Zheng, Kunkun; Cen, Mingcan
Format: Article
Language: English
Abstract: With the rapid growth of the Internet, new data have become easy to obtain in many application domains. However, when new data are added to an artificial neural network (ANN) for further training, the network can completely forget what it learned before, a phenomenon known as catastrophic forgetting. The main cause is the inability of ANNs to balance new classes against old ones. To address the challenge of learning new knowledge without catastrophic forgetting, several incremental learning algorithms have been proposed. This paper balances the features of new and old classes by using angular distillation, and retains some exemplars from the old classes to improve performance on the old data. The effectiveness of the algorithm is demonstrated on the CIFAR-100 dataset.
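The abstract does not spell out the paper's exact loss, but angular distillation is commonly formulated as constraining the angle between the features produced by the old (teacher) and new (student) models for the same input. A minimal sketch of such a cosine-based term, with all names hypothetical:

```python
import numpy as np

def angular_distillation_loss(feat_old, feat_new, eps=1e-8):
    """Hypothetical angular distillation term (not the paper's exact loss):
    penalize the angle between old-model and new-model feature vectors for
    each sample, so the new model preserves directions learned on old classes."""
    # L2-normalize each row so only the feature *direction* matters
    fo = feat_old / (np.linalg.norm(feat_old, axis=1, keepdims=True) + eps)
    fn = feat_new / (np.linalg.norm(feat_new, axis=1, keepdims=True) + eps)
    cos_sim = np.sum(fo * fn, axis=1)     # cosine of the angle, per sample
    return float(np.mean(1.0 - cos_sim))  # 0 when directions agree exactly

# Scaling a feature vector does not change its angle, so the loss stays 0;
# orthogonal features give the maximum per-sample penalty of 1.
a = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([[0.0, 1.0], [2.0, 0.0]])
print(angular_distillation_loss(a, 3.0 * a))  # ≈ 0.0
print(angular_distillation_loss(a, b))        # ≈ 1.0
```

In an incremental-learning setup this term would typically be added to the classification loss on the new classes, with the retained old-class exemplars replayed in each training batch.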
ISSN: 1742-6588 (print), 1742-6596 (online)
DOI: 10.1088/1742-6596/1880/1/012030