Unsupervised Cluster Guided Object Detection in Aerial Images


Bibliographic Details
Published in: IEEE journal of selected topics in applied earth observations and remote sensing 2021, Vol.14, p.11204-11216
Authors: Liao, Jiajia, Piao, Yingchao, Su, Jinhe, Cai, Guorong, Huang, Xingwang, Chen, Long, Huang, Zhaohong, Wu, Yundong
Format: Article
Language: English
Keywords:
Online access: Full text
Description
Abstract: Object detection from high-resolution aerial images has received increasing attention during the last few years. It is common practice to downsize images before feeding them into a network. In real life, there are many scenes where objects gather together in certain areas, such as crossroads, parking lots, and playgrounds. The downsizing operation significantly limits detection ability in these scenes. In this article, we propose an unsupervised cluster guided detection framework (UCGNet) to address these issues by guiding the detector to focus on areas where objects are densely distributed. In particular, a local location module is first applied to predict a binary map indicating, pixel by pixel, how objects are distributed. Then, an unsupervised clustering method is used to produce dense regions. Each adjusted dense region is fed into the detector for object detection. Finally, a global merge module generates the final detection results. Experiments were conducted on two popular aerial image datasets, VisDrone2019 and UAVDT. On both datasets, our proposed method outperforms the existing baseline methods, achieving 32.8% and 19.1% mAP, respectively.
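The pipeline in the abstract (binary distribution map → unsupervised clustering into dense regions → per-region detection) can be illustrated with a minimal sketch. The paper does not specify its clustering algorithm here, so this example stands in with simple 4-connected component grouping on the binary map; the function name `dense_regions` and the box format `(x0, y0, x1, y1)` are illustrative assumptions, not the authors' API.

```python
from collections import deque

def dense_regions(binary_map):
    """Group foreground pixels of a binary object-distribution map into
    connected components and return one bounding box (x0, y0, x1, y1)
    per dense region. Each box would then be cropped from the full-resolution
    image and passed to the detector instead of the downsized whole image."""
    h, w = len(binary_map), len(binary_map[0])
    seen = [[False] * w for _ in range(h)]
    boxes = []
    for y in range(h):
        for x in range(w):
            if binary_map[y][x] and not seen[y][x]:
                # BFS flood fill over 4-connected foreground pixels,
                # tracking the extent of the component as we go.
                q = deque([(y, x)])
                seen[y][x] = True
                y0 = y1 = y
                x0 = x1 = x
                while q:
                    cy, cx = q.popleft()
                    y0, y1 = min(y0, cy), max(y1, cy)
                    x0, x1 = min(x0, cx), max(x1, cx)
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary_map[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                boxes.append((x0, y0, x1, y1))
    return boxes

# Two separated clusters of foreground pixels yield two dense regions.
bmap = [
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 0, 1],
]
print(dense_regions(bmap))
```

In the full framework, each region would typically be padded and resized before detection, and the global merge step would map per-region boxes back into image coordinates and apply non-maximum suppression across regions.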
ISSN:1939-1404
2151-1535
DOI:10.1109/JSTARS.2021.3122152