Improving diversity and discriminability based implicit contrastive learning for unsupervised domain adaptation
Published in: Applied Intelligence (Dordrecht, Netherlands), 2024-10, Vol. 54 (20), p. 10007-10017
Format: Article
Language: English
Online access: Full text
Abstract: In unsupervised domain adaptation (UDA), knowledge is transferred from label-rich source domains to relevant but unlabeled target domains. Most current state-of-the-art works suggest that performing domain alignment from the class perspective can alleviate domain shift. However, most of them are based on domain-adversarial training, which is hard to train and slow to converge. In this paper, we propose a novel contrastive learning method, dubbed IDD_ICL, that improves diversity and discriminability for domain adaptation: it strengthens the discriminative power of the model while increasing sample diversity. Specifically, we first design a novel implicit contrastive learning loss at the sample level by implicitly augmenting samples of the source domain. While augmenting the diversity of the source domain, we cluster samples of the same category together and disperse samples of different categories, thereby improving the discriminative ability of the model. Furthermore, we show that our algorithm is effective by implicitly learning from an infinite number of similar samples. Our results demonstrate that our method does not require complex techniques or specialized equipment, making it readily adoptable and applicable in practical scenarios.
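The abstract's idea of implicitly augmenting source samples (learning from an effectively infinite set of similar samples without generating them) is commonly realized by absorbing a Gaussian perturbation of the feature into the loss in closed form, as in implicit semantic data augmentation. The paper's exact IDD_ICL loss is not reproduced in this record, so the NumPy sketch below is an assumption illustrating only the general mechanism: each feature `x` is treated as drawn from `N(x, lam * Sigma)` with a class-conditional covariance `Sigma`, and the expected softmax loss is upper-bounded by adding a quadratic term to each logit, so no explicit augmented samples are ever materialized.

```python
import numpy as np

def softmax_ce(logits, y):
    """Standard cross-entropy for one sample (numerically stabilized)."""
    z = logits - logits.max()
    return -(z[y] - np.log(np.exp(z).sum()))

def implicit_aug_loss(x, y, W, Sigma, lam):
    """Closed-form upper bound of the expected cross-entropy when the
    feature x is implicitly augmented as x~ ~ N(x, lam * Sigma):

        L = -log softmax_y( w_j^T x + (lam/2)(w_j - w_y)^T Sigma (w_j - w_y) )

    For j == y the quadratic term vanishes, so lam = 0 recovers the
    standard loss; lam > 0 penalizes classes whose direction overlaps
    the intra-class variation encoded in Sigma. (Hypothetical helper,
    not the IDD_ICL loss itself.)
    """
    logits = W @ x                       # (C,) classifier scores
    delta = W - W[y]                     # (C, d): rows w_j - w_y
    quad = 0.5 * lam * np.einsum('cd,de,ce->c', delta, Sigma, delta)
    return softmax_ce(logits + quad, y)
```

Because `Sigma` is positive semi-definite, the quadratic term is non-negative, so the implicit loss is always at least the standard cross-entropy; training thus optimizes a surrogate for infinitely many perturbed copies of each source sample at the cost of one extra quadratic form per class.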
ISSN: 0924-669X, 1573-7497
DOI: 10.1007/s10489-024-05351-y