Discrimination Correction and Balance for Class-Incremental Learning

Bibliographic Details
Published in: Journal of physics. Conference series, 2022-09, Vol. 2347 (1), p. 12024
Main authors: Wang, Ruixiang; Luo, Yong; Ren, Yitao; Mao, Keming
Format: Article
Language: English
Online access: Full text
Description
Abstract: Catastrophic forgetting is a central problem in incremental learning: as tasks arrive continuously and storage is limited, the model performs worse on old classes than on new ones. To balance model plasticity and stability, this paper proposes a novel class-incremental learning method, DCBIL, which adopts discrimination correction and balance to avoid model bias. First, three components with identical structure, Conservative, Balancer and Opener, are designed. Conservative and Opener are biased toward old and new classes, respectively, while Balancer modifies the fully connected layer and replaces the base model for the next training stage. Moreover, a stochastic perturbation probability synergy is incorporated for final output fusion. Experimental evaluations on MNIST, CIFAR-10, CIFAR-100 and TinyImageNet demonstrate the effectiveness of DCBIL. DCBIL is portable: it can be used as a plug-in to strengthen existing incremental learning models.
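
The abstract only sketches the three-component design, so the following is a minimal illustrative sketch in PyTorch of identically structured Conservative, Balancer and Opener networks whose outputs are fused with randomly perturbed mixing weights. All concrete names here (SmallNet, fuse_outputs, sigma), the backbone architecture, and the fusion rule are assumptions for illustration; the paper's actual "stochastic perturbation probability synergy" is not specified in the abstract.

```python
import torch
import torch.nn as nn

class SmallNet(nn.Module):
    """Identical backbone shared by all three components (hypothetical architecture)."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fully connected layer: in DCBIL the Balancer is said to modify this layer.
        self.fc = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.fc(self.features(x))

num_classes = 10
conservative = SmallNet(num_classes)  # component biased toward old classes
balancer = SmallNet(num_classes)      # replaces the base model for the next task
opener = SmallNet(num_classes)        # component biased toward new classes

def fuse_outputs(x, sigma: float = 0.05):
    """Fuse the three outputs with stochastically perturbed convex weights.
    This is a stand-in for the fusion scheme described in the abstract."""
    logits = torch.stack([conservative(x), balancer(x), opener(x)])  # (3, B, C)
    weights = torch.softmax(torch.randn(3) * sigma, dim=0)           # perturbed mixing weights
    return torch.einsum("k,kbc->bc", weights, logits)                # (B, C)

# Usage: fuse predictions for a batch of CIFAR-sized images.
images = torch.randn(8, 3, 32, 32)
fused = fuse_outputs(images)
print(fused.shape)  # torch.Size([8, 10])
```

The sketch only shows why identical structure matters for fusion: because the three heads share the same output space, their logits can be combined directly without alignment.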
ISSN: 1742-6588; 1742-6596
DOI: 10.1088/1742-6596/2347/1/012024