SCREAM: Knowledge sharing and compact representation for class incremental learning



Bibliographic Details
Published in: Information Processing & Management, 2024-05, Vol. 61 (3), Article 103629, p. 103629
Authors: Feng, Zhikun; Zhou, Mian; Gao, Zan; Stefanidis, Angelos; Sui, Zezhou
Format: Article
Language: English
Online access: Full text
Description
Abstract: Methods based on dynamic structures are effective at addressing catastrophic forgetting in class-incremental learning (CIL). However, they often isolate sub-networks and overlook the integration of overall information, resulting in a performance decline. To overcome this limitation, we recognize the importance of knowledge sharing among sub-networks. Building on dynamic networks, we establish a novel two-stage CIL method called SCREAM, comprising an Expandable Network (EN) learning stage and a Compact Representation (CR) stage: (1) we design a clustering loss function for EN that aggregates related instances and promotes information sharing; (2) we design dynamic weight alignment to alleviate the classifier's bias towards new-class knowledge; and (3) we design a balanced decoupled distillation for CR that mitigates the impact of the long-tail effect during multiple compressions. To validate the performance of SCREAM, we use three widely used datasets and vary the replay-buffer size for comparison with current state-of-the-art models. The results show that on CIFAR-100, ImageNet-100/1000, and Tiny-ImageNet, SCREAM exceeds the average accuracy of these models by 2.46%, 1.22%, and 1.52%, respectively. With a smaller buffer size, SCREAM's margin in average accuracy exceeds 4.60%. Furthermore, SCREAM also performs well in terms of the resources required.
ISSN: 0306-4573, 1873-5371
DOI: 10.1016/j.ipm.2023.103629