Domain Adaptive Urban Garbage Detection Based on Attention and Confidence Fusion

Bibliographic Details
Published in: Information (Basel), 2024-11, Vol. 15 (11), p. 699
Main authors: Yuan, Tianlong; Lin, Jietao; Hu, Keyong; Chen, Wenqian; Hu, Yifan
Format: Article
Language: English
Online access: Full text
Description
Abstract: To overcome the challenges posed by limited garbage datasets and the laborious nature of data labeling in urban garbage object detection, we propose an innovative unsupervised domain adaptation approach to detecting garbage objects in urban aerial images. The proposed method leverages a detector, initially trained on source domain images, to generate pseudo-labels for target domain images. By employing an attention and confidence fusion strategy, images from both source and target domains can be seamlessly integrated, thereby enabling the detector to incrementally adapt to target domain scenarios while preserving its detection efficacy in the source domain. This approach mitigates the performance degradation caused by domain discrepancies, significantly enhancing the model's adaptability. The proposed method was validated on a self-constructed urban garbage dataset. Experimental results demonstrate its superior performance over baseline models. Furthermore, we extended the proposed mixing method to other typical scenarios and conducted comprehensive experiments on four well-known public datasets: Cityscapes, KITTI, Sim10k, and Foggy Cityscapes. The results show that the proposed method exhibits remarkable effectiveness and adaptability across diverse datasets.
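The abstract describes two building blocks: keeping only high-confidence pseudo-labels produced by the source-trained detector on target images, and mixing source and target images before further training. The snippet below is a minimal sketch of that general idea, not the authors' implementation; the confidence threshold, the per-pixel weight map, and the function names (filter_pseudo_labels, fuse_images) are illustrative assumptions, and the paper's actual attention and confidence fusion strategy may differ.

```python
# Minimal sketch of confidence-filtered pseudo-labels and a simple
# source/target image blend. All thresholds, shapes, and names are
# illustrative assumptions, not the method proposed in the paper.
import numpy as np


def filter_pseudo_labels(boxes, scores, conf_thresh=0.5):
    """Keep only target-domain detections whose confidence exceeds a
    threshold, so low-quality pseudo-labels do not pollute training."""
    keep = scores >= conf_thresh
    return boxes[keep], scores[keep]


def fuse_images(source_img, target_img, weight_map):
    """Blend a source and a target image pixel-wise with a weight map
    in [0, 1] (e.g. derived from an attention/confidence map); higher
    weights keep more of the target image content."""
    w = weight_map[..., None]  # broadcast the weights over channels
    return (1.0 - w) * source_img + w * target_img


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Fake detector output on a target-domain image.
    boxes = rng.uniform(0, 256, size=(10, 4))
    scores = rng.uniform(0, 1, size=10)
    kept_boxes, kept_scores = filter_pseudo_labels(boxes, scores)
    print(f"kept {len(kept_boxes)} of {len(boxes)} pseudo-labels")

    # Fake source/target images and a per-pixel weight map.
    src = rng.uniform(0, 1, size=(256, 256, 3))
    tgt = rng.uniform(0, 1, size=(256, 256, 3))
    weights = rng.uniform(0, 1, size=(256, 256))
    mixed = fuse_images(src, tgt, weights)
    print("mixed image shape:", mixed.shape)
```

In a training loop, the mixed image together with the retained pseudo-labels would serve as additional supervision, which is one common way such a mixing strategy is realized; the exact loss weighting used by the authors is not specified in the abstract.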
ISSN: 2078-2489
DOI: 10.3390/info15110699