Class-aware cross-domain target detection based on cityscape in fog

Bibliographic Details
Published in: Machine Vision and Applications, 2023-11, Vol. 34 (6), p. 114, Article 114
Main Authors: Gan, Linfeng; Liu, Hu; Chen, Aoran; Xu, Xibin; Zhang, Xuebiao
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Unsupervised simulation-to-real adaptation (USRA) for semantic segmentation aims to train models on simulation data that perform well in real-world environments. In practical applications such as robotic vision and autonomous driving, this can save the cost of manually annotating data. Conventional USRA setups assume that a large sample of unlabeled real-world data is available for training. This assumption often fails in practice: such data is difficult to collect, and for some scenarios real data remains scarce. Our aim is therefore to reduce the need for large amounts of real data in unsupervised simulation-to-real domain adaptation (USDA) and domain generalization (USDG), where only limited real-world data exists. To compensate for the limited real data, this paper first constructs a pseudo-target domain by transferring the style of the real data onto the simulation data. Building on this, the paper proposes a class-aware cross-domain randomization method to extract domain-invariant knowledge from the simulated and pseudo-target domains. We demonstrate the effectiveness of our approach on USDA and USDG benchmarks such as Cityscapes and Foggy Cityscapes, where it substantially outperforms existing methods.
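
The record contains no code, but the abstract outlines two concrete mechanisms. The sketch below is a minimal PyTorch illustration of how they might look, not the authors' implementation: transfer_style stands in for the pseudo-target-domain construction, using AdaIN-style channel-statistics matching as an assumed choice of style transfer, and class_aware_mix stands in for class-aware cross-domain randomization as a hypothetical class-wise copy-paste guided by the semantic label map. All names, shapes, and parameters are illustrative.

import torch

def transfer_style(sim: torch.Tensor, real: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    # Re-normalize each channel of the simulated image (C,H,W) so its
    # per-channel mean/std match those of the real image: a common
    # AdaIN-style stand-in for the paper's style-transfer step.
    sim_mu = sim.mean(dim=(1, 2), keepdim=True)
    sim_std = sim.std(dim=(1, 2), keepdim=True)
    real_mu = real.mean(dim=(1, 2), keepdim=True)
    real_std = real.std(dim=(1, 2), keepdim=True)
    return (sim - sim_mu) / (sim_std + eps) * real_std + real_mu

def class_aware_mix(sim: torch.Tensor, pseudo: torch.Tensor,
                    label: torch.Tensor, classes: torch.Tensor, k: int = 3) -> torch.Tensor:
    # Hypothetical class-aware randomization: pick k semantic classes at
    # random and copy their pixels from the pseudo-target image into the
    # simulated one, so class regions mix across the two domains.
    picked = classes[torch.randperm(len(classes))[:k]]
    mask = torch.isin(label, picked).unsqueeze(0)  # (1,H,W), broadcasts over C
    return torch.where(mask, pseudo, sim)

# Usage on dummy data: one real image stylizes a simulated image, then the
# class-aware mix yields a domain-randomized training sample.
sim_img = torch.rand(3, 256, 512)
real_img = torch.rand(3, 256, 512)
sem_map = torch.randint(0, 19, (256, 512))  # 19 Cityscapes classes
pseudo_img = transfer_style(sim_img, real_img)
mixed = class_aware_mix(sim_img, pseudo_img, sem_map, classes=torch.arange(19))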
ISSN: 0932-8092 (print); 1432-1769 (electronic)
DOI: 10.1007/s00138-023-01463-6