Towards More General Loss and Setting in Unsupervised Domain Adaptation

Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, 2023-10, Vol. 35 (10), pp. 10140-10150
Main Authors: Shui, Changjian; Pu, Ruizhi; Xu, Gezheng; Wen, Jun; Zhou, Fan; Gagné, Christian; Ling, Charles X.; Wang, Boyu
Format: Article
Language: English
Description
Summary: In this article, we present an analysis of unsupervised domain adaptation with a series of theoretical and algorithmic results. We derive a novel Rényi-α divergence-based generalization bound, which is tailored to domain adaptation algorithms with arbitrary loss functions in a stochastic setting. Moreover, our theoretical results provide new insights into the assumptions for successful domain adaptation: closeness between the conditional distributions of the domains and Lipschitzness on the source domain. Under these assumptions, we reveal the following: if the conditional generative distributions of the two domains are close, the Lipschitzness property of the target domain can be transferred from the Lipschitzness on the source domain, without knowing the exact target distribution. Motivated by our analysis and assumptions, we further derive practical principles for deep domain adaptation: 1) Rényi-2 adversarial training for marginal distribution matching and 2) Lipschitz regularization for the classifier. Our experimental results on both synthetic and real-world datasets support our theoretical findings and the practical effectiveness of the proposed principles.
(For an illustrative sketch of principles 1) and 2), see the code after the record below.)
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2023.3266785
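
The abstract's two practical principles, Rényi-2 adversarial training for marginal matching and Lipschitz regularization of the classifier, can be made concrete with a short sketch. The PyTorch code below is an assumption-laden illustration, not the authors' implementation: the module names (feat, clf, disc), architectures, and loss weights (0.1, 0.01) are hypothetical; it estimates the Rényi-2 divergence indirectly through the f-GAN variational lower bound on the Pearson chi-square divergence, using the identity D_2(P||Q) = log(1 + chi^2(P||Q)); and the gradient penalty is one common surrogate for constraining the classifier's Lipschitz constant.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical components: shared feature encoder, classifier head, and a
# critic/discriminator operating on the 16-dim feature space.
feat = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 16))
clf = nn.Linear(16, 2)
disc = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))

def chi2_lower_bound(zs, zt):
    # f-GAN variational lower bound on the Pearson chi-square divergence:
    # chi2(p_s || p_t) >= E_s[T] - E_t[T + T^2 / 4], with T = disc output.
    # Since the Rényi-2 divergence equals log(1 + chi2), shrinking this
    # surrogate also shrinks the Rényi-2 divergence between the marginals.
    t_s, t_t = disc(zs), disc(zt)
    return t_s.mean() - (t_t + t_t.pow(2) / 4.0).mean()

def lipschitz_penalty(z, logits):
    # Gradient penalty: a common surrogate for bounding the classifier's
    # Lipschitz constant on the observed feature distribution.
    grads = torch.autograd.grad(logits.sum(), z, create_graph=True)[0]
    return grads.norm(dim=1).pow(2).mean()

opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(list(feat.parameters()) + list(clf.parameters()), lr=1e-3)

# Toy batch: labeled source data (xs, ys) and unlabeled target data (xt).
xs, ys = torch.randn(32, 2), torch.randint(0, 2, (32,))
xt = torch.randn(32, 2) + 1.0

# (a) Critic step: ascend the bound to tighten the divergence estimate.
zs, zt = feat(xs).detach(), feat(xt).detach()
d_loss = -chi2_lower_bound(zs, zt)
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# (b) Encoder/classifier step: source task loss + Rényi-2 marginal matching
#     + Lipschitz regularization (weights are arbitrary placeholders).
zs, zt = feat(xs), feat(xt)
zs_req = zs.detach().requires_grad_(True)  # penalty constrains clf only
total = (F.cross_entropy(clf(zs), ys)
         + 0.1 * chi2_lower_bound(zs, zt)
         + 0.01 * lipschitz_penalty(zs_req, clf(zs_req)))
opt_g.zero_grad(); total.backward(); opt_g.step()

Note that the gradient penalty is computed on detached features (zs_req), so it constrains only the classifier head, in line with principle 2)'s Lipschitz regularization "for the classifier"; in a full training loop, steps (a) and (b) would alternate over mini-batches.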