Open Set Domain Adaptation via Joint Alignment and Category Separation
Saved in:
Published in: | IEEE transactions on neural networks and learning systems 2023-09, Vol.34 (9), p.6186-6199 |
---|---|
Main Authors: | , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Order full text |
Summary: | Prevalent domain adaptation approaches are suitable for a closed-set scenario, where the source domain and the target domain are assumed to share the same data categories. However, this assumption is often violated in real-world conditions, where the target domain usually contains samples of categories that are not present in the source domain. This setting is termed open set domain adaptation (OSDA). Most existing domain adaptation approaches do not work well in this situation. In this article, we propose an effective method, named joint alignment and category separation (JACS), for OSDA. Specifically, JACS learns a latent shared space in which the marginal and conditional divergence of feature distributions for commonly known classes across domains is alleviated (Joint Alignment), the distribution discrepancy between the known classes and the unknown class is enlarged, and the distance between different known classes is also maximized (Category Separation). These two aspects are unified into one objective so that the optimization of each part reinforces the other. The classifier is obtained from the learned feature representations by minimizing the structural risk in a reproducing kernel Hilbert space. Extensive experimental results verify that our method outperforms other state-of-the-art approaches on several benchmark datasets. |
---|---|
ISSN: | 2162-237X 2162-2388 |
DOI: | 10.1109/TNNLS.2021.3134673 |
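The abstract describes alleviating marginal and conditional feature-distribution divergence across domains. A standard way to measure such divergence (used by many JDA-style methods, though not necessarily the authors' exact formulation) is the maximum mean discrepancy (MMD): marginal MMD compares the overall source and target feature means, while conditional MMD sums per-class terms using target pseudo-labels. A minimal NumPy sketch, assuming a linear kernel and hypothetical array inputs:

```python
import numpy as np

def mmd_linear(Xs, Xt):
    """Squared MMD with a linear kernel:
    ||mean(Xs) - mean(Xt)||^2 over feature rows."""
    delta = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(delta @ delta)

def conditional_mmd(Xs, ys, Xt, yt_pseudo, classes):
    """Sum of per-class MMD terms (conditional divergence).
    Target labels are pseudo-labels, as commonly done when
    true target labels are unavailable."""
    total = 0.0
    for c in classes:
        Xs_c = Xs[ys == c]
        Xt_c = Xt[yt_pseudo == c]
        if len(Xs_c) and len(Xt_c):  # skip classes missing in either domain
            total += mmd_linear(Xs_c, Xt_c)
    return total
```

A method in the spirit of JACS would minimize both terms for the shared known classes while adding separation terms that push unknown-class target samples away; those separation terms are specific to the paper and are not reproduced here.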