Cross Domain Object Detection by Target-Perceived Dual Branch Distillation
Saved in:
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Cross domain object detection is a realistic and challenging task in the wild. It suffers from performance degradation due to the large shift in data distributions and the lack of instance-level annotations in the target domain. Existing approaches mainly focus on either of these two difficulties, even though they are closely coupled in cross domain object detection. To solve this problem, we propose a novel Target-perceived Dual-branch Distillation (TDD) framework. By integrating detection branches of both source and target domains in a unified teacher-student learning scheme, it can reduce domain shift and generate reliable supervision effectively. In particular, we first introduce a distinct Target Proposal Perceiver between the two domains. It can adaptively enhance the source detector to perceive objects in a target image by leveraging target proposal contexts from iterative cross-attention. Afterwards, we design a concise Dual Branch Self Distillation strategy for model training, which can progressively integrate complementary object knowledge from different domains via self-distillation in two branches. Finally, we conduct extensive experiments on a number of widely used scenarios in cross domain object detection. The results show that our TDD significantly outperforms the state-of-the-art methods on all the benchmarks. Our code and model will be available at https://github.com/Feobi1999/TDD.
DOI: 10.48550/arxiv.2205.01291
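
The abstract describes a Target Proposal Perceiver that enhances the source branch with target proposal contexts through iterative cross-attention. The sketch below is a minimal, hypothetical PyTorch rendering of that idea; the module name, feature shapes, attention direction, and hyperparameters are assumptions made here for illustration and are not taken from the authors' released code at https://github.com/Feobi1999/TDD.

```python
import torch
import torch.nn as nn


class TargetProposalPerceiver(nn.Module):
    """Minimal sketch of iterative cross-attention between two branches.

    Assumed layout (not the released implementation): source-branch region
    features act as queries and attend to target-proposal context features
    for a few iterations, so the source detector becomes aware of objects
    in the target image.
    """

    def __init__(self, dim=256, num_heads=8, num_iters=2):
        super().__init__()
        self.num_iters = num_iters
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ffn = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.ReLU(), nn.Linear(4 * dim, dim)
        )

    def forward(self, source_feats, target_proposals):
        # source_feats:     (B, N_src, dim) region features from the source branch
        # target_proposals: (B, N_tgt, dim) proposal context features from the target image
        x = source_feats
        for _ in range(self.num_iters):
            attended, _ = self.cross_attn(
                query=x, key=target_proposals, value=target_proposals
            )
            x = self.norm1(x + attended)     # residual connection after cross-attention
            x = self.norm2(x + self.ffn(x))  # lightweight feed-forward refinement
        return x
```

Keeping the target proposals on the key/value side leaves the number of source-branch features unchanged, so in this hypothetical layout the enhanced features could be fed to an existing detection head without further changes.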
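The abstract also frames training as a unified teacher-student scheme with dual-branch self-distillation. A common way to maintain such a teacher, assumed here purely for illustration, is an exponential-moving-average (EMA) copy of the student whose predictions supervise the target branch; TDD's actual teacher update and distillation losses may differ.

```python
import torch
import torch.nn as nn


@torch.no_grad()
def ema_update(teacher: nn.Module, student: nn.Module, momentum: float = 0.999):
    """EMA update of a teacher from a student, the usual way a teacher is
    maintained in teacher-student detection frameworks (an assumption here,
    not necessarily the rule used by TDD)."""
    for p_t, p_s in zip(teacher.parameters(), student.parameters()):
        p_t.mul_(momentum).add_(p_s.detach(), alpha=1.0 - momentum)
    for b_t, b_s in zip(teacher.buffers(), student.buffers()):
        b_t.copy_(b_s)  # keep BatchNorm statistics and other buffers in sync
```

In such a scheme, `ema_update(teacher, student)` would be called after each optimizer step on the student, and the teacher's confident predictions on target images would then serve as pseudo supervision for the student's target branch.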