Class Incremental Learning with Large Domain Shift

Bibliographic Details
Published in: IEEE Access, 2024-01, Vol. 12, p. 1-1
Authors: Lee, Kamin; Kim, Hyoeun; Choi, Geunjae; Jeon, Hyejeong; Kwak, Nojun
Format: Article
Language: English
Description
Abstract: We address an important and practical problem in deep-learning-based image classification: class incremental learning with a large domain shift. Most previous efforts on class incremental learning focus on one aspect of the problem, i.e., learning to classify additional new classes (with only a small shift). In the real world, however, new classes are added while the domain shifts simultaneously (a large domain shift). To obtain a model that is robust in these situations, we need to incrementally learn not only new labels but also domain-shifted labels. We target a continual and simultaneous shift of the class and domain distributions and propose a new incremental learning method named Momentum Contrastive learning enhancing Orthogonality of Negative pairs (MoCo-ON). We employ a momentum encoder framework augmented with rehearsal memory to mitigate forgetting, while leveraging contrastive learning to extract versatile features that can adapt to the progressively shifting domain. Specifically, when training with a knowledge distillation loss, we introduce a novel supervised contrastive loss designed to embed positive pairs of the same class closely, even in the presence of a substantial domain gap. Additionally, we use feature embeddings from the momentum encoder for exemplar selection, aiming to mitigate the forgetting of information acquired in earlier tasks. We conduct comprehensive experiments on class incremental learning scenarios with inter-domain shifts, using widely adopted datasets commonly employed for studying domain generalization in image classification. Our proposed method consistently outperforms competing methods by a significant margin.
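
As a rough guide to the components named in the abstract, here is a minimal LaTeX sketch of the two standard ingredients it refers to: the MoCo-style momentum update of the key encoder and a generic supervised contrastive loss that pulls same-class pairs together even across domains. The abstract does not give the actual MoCo-ON objective or its orthogonality term on negative pairs, so the symbols below (\theta_q, \theta_k, m, z_i, P(i), A(i), \tau) follow common convention and are assumptions, not the authors' definitions.

% MoCo-style momentum (key) encoder update with momentum coefficient m:
\theta_k \leftarrow m\,\theta_k + (1 - m)\,\theta_q

% Generic supervised contrastive loss: z_i is the normalized embedding of
% anchor i, P(i) the positives sharing its class label (possibly from other
% domains), A(i) all other embeddings in the batch/queue, \tau a temperature:
\mathcal{L}_{\mathrm{sup}} = \sum_{i} \frac{-1}{|P(i)|} \sum_{p \in P(i)}
    \log \frac{\exp(z_i \cdot z_p / \tau)}{\sum_{a \in A(i)} \exp(z_i \cdot z_a / \tau)}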
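
The abstract also mentions selecting rehearsal exemplars using feature embeddings from the momentum encoder. A common way to implement such selection is herding, as in iCaRL; the sketch below assumes that choice, and the function name select_exemplars and its arguments are illustrative rather than taken from the paper.

import numpy as np

def select_exemplars(features, m):
    """Herding-style exemplar selection: greedily pick m samples whose running
    mean best approximates the class-mean feature. `features` is an (N, D)
    array of embeddings, here assumed to come from the momentum (key) encoder;
    returns the indices of the chosen exemplars (assumes m <= N)."""
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    class_mean = features.mean(axis=0)
    class_mean /= np.linalg.norm(class_mean)

    chosen, running_sum = [], np.zeros_like(class_mean)
    for k in range(1, m + 1):
        # candidate running means if each sample were added as the k-th exemplar
        candidate_means = (running_sum + features) / k
        dists = np.linalg.norm(class_mean - candidate_means, axis=1)
        dists[chosen] = np.inf  # never pick the same sample twice
        idx = int(np.argmin(dists))
        chosen.append(idx)
        running_sum += features[idx]
    return chosen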
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3504287