Self-Cooperation Knowledge Distillation for Novel Class Discovery
Main Authors: , , , ,
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Novel Class Discovery (NCD) aims to discover unknown, novel classes in an
unlabeled set by leveraging knowledge already learned about known classes.
Existing works focus on instance-level or class-level knowledge representation
and build a shared representation space to achieve performance improvements.
However, a long-neglected issue is the potentially imbalanced number of samples
from known and novel classes, which pushes the model towards the dominant classes.
These methods therefore face a challenging trade-off between reviewing known
classes and discovering novel classes. Based on this observation, we propose a
Self-Cooperation Knowledge Distillation (SCKD) method that uses every training
sample (known or novel, labeled or unlabeled) for both review and discovery.
Specifically, the model's feature representations of known and novel classes are
used to construct two disjoint representation spaces. Through spatial mutual
information, we design a self-cooperation learning scheme that encourages the
model to learn from its own two feature representation spaces. Extensive
experiments on six datasets demonstrate that our method achieves significant
performance improvements and state-of-the-art results.
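As a rough illustration of the idea sketched in the abstract (not the authors' released code), the following PyTorch snippet builds two projection heads over shared backbone features, one per representation space for known and novel classes, and distills each head's batch-level similarity structure into the other. All names (SelfCooperationKD, known_proj, novel_proj), the KL-based agreement loss, and the hyper-parameters are assumptions standing in for the paper's spatial-mutual-information objective.

```python
# Illustrative sketch only: class names, the loss form, and hyper-parameters
# are assumptions, not the SCKD paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfCooperationKD(nn.Module):
    """Toy self-cooperation distillation: one backbone feeds two heads
    (known vs. novel classes); each head is encouraged to agree with the
    batch similarity structure computed in the other head's space."""

    def __init__(self, feat_dim=512, proj_dim=128, temperature=0.1):
        super().__init__()
        self.known_proj = nn.Linear(feat_dim, proj_dim)  # known-class representation space
        self.novel_proj = nn.Linear(feat_dim, proj_dim)  # novel-class representation space
        self.t = temperature

    @staticmethod
    def _similarity(z):
        # Pairwise cosine-similarity matrix over the batch.
        z = F.normalize(z, dim=1)
        return z @ z.T

    def forward(self, feats):
        # feats: backbone features for a mixed batch of known/novel samples.
        z_known = self.known_proj(feats)
        z_novel = self.novel_proj(feats)

        sim_known = self._similarity(z_known) / self.t
        sim_novel = self._similarity(z_novel) / self.t

        # Symmetric KL-style agreement: each space distills the other's
        # batch-level similarity structure (a stand-in for the paper's
        # spatial-mutual-information objective).
        loss_kn = F.kl_div(F.log_softmax(sim_known, dim=1),
                           F.softmax(sim_novel, dim=1),
                           reduction="batchmean")
        loss_nk = F.kl_div(F.log_softmax(sim_novel, dim=1),
                           F.softmax(sim_known, dim=1),
                           reduction="batchmean")
        return 0.5 * (loss_kn + loss_nk)

# Usage sketch with random features standing in for backbone outputs.
if __name__ == "__main__":
    feats = torch.randn(32, 512)
    loss = SelfCooperationKD()(feats)
    print(float(loss))
```

In this sketch, the distillation term would be added to whatever supervised loss is used for labeled known-class samples and whatever clustering objective is used for unlabeled novel-class samples.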
DOI: 10.48550/arxiv.2407.01930