Upcycling Models under Domain and Category Shift
Format: Article
Language: English
Abstract: Deep neural networks (DNNs) often perform poorly in the presence of domain
shift and category shift. How to upcycle DNNs and adapt them to the target task
remains an important open problem. Unsupervised Domain Adaptation (UDA),
especially recently proposed Source-free Domain Adaptation (SFDA), has become a
promising technology to address this issue. Nevertheless, existing SFDA methods
require that the source domain and target domain share the same label space,
consequently being only applicable to the vanilla closed-set setting. In this
paper, we take one step further and explore the Source-free Universal Domain
Adaptation (SF-UniDA). The goal is to identify "known" data samples under both
domain and category shift, and reject those "unknown" data samples (not present
in source classes), with only the knowledge from standard pre-trained source
model. To this end, we introduce an innovative global and local clustering
learning technique (GLC). Specifically, we design a novel, adaptive one-vs-all
global clustering algorithm to achieve the distinction across different target
classes and introduce a local k-NN clustering strategy to alleviate negative
transfer. We examine the superiority of our GLC on multiple benchmarks with
different category shift scenarios, including partial-set, open-set, and
open-partial-set DA. Remarkably, in the most challenging open-partial-set DA
scenario, GLC outperforms UMAD by 14.8% on the VisDA benchmark. The code is
available at https://github.com/ispc-lab/GLC.
DOI: 10.48550/arxiv.2303.07110
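The abstract mentions a local k-NN clustering strategy used to alleviate negative transfer. As a rough illustration of the general idea (not the paper's actual algorithm — the function name, the cosine metric, and the probability-averaging step are all assumptions here), one can smooth each target sample's prediction with those of its nearest neighbors in feature space:

```python
import numpy as np

def knn_consensus(features, probs, k=4):
    """Hypothetical sketch of local k-NN consensus: replace each
    sample's class probabilities with the average over its k nearest
    neighbors (cosine similarity), encouraging locally consistent
    predictions on the unlabeled target domain."""
    # L2-normalize features so dot products equal cosine similarity
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T
    np.fill_diagonal(sim, -np.inf)         # exclude each sample itself
    idx = np.argsort(-sim, axis=1)[:, :k]  # indices of k nearest neighbors
    return probs[idx].mean(axis=1)         # (n, k, C) -> (n, C) average
```

In the actual GLC method this local signal is combined with a global one-vs-all clustering objective; the sketch above only shows the neighborhood-averaging intuition in isolation.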