SD-Net: Spatial Dual Network for Aerial Object Detection

Bibliographic Details
Published in: Journal of the Indian Society of Remote Sensing, 2023-10, Vol. 51(10), p. 2067-2076
Authors: Gao, Yangte; Bi, Fukun; Chen, Liang; Nie, Xiaoyu
Format: Article
Language: English
Online Access: Full text
Abstract: The distribution direction of aerial objects is arbitrary compared to objects in natural images. However, existing detectors identify and locate targets by relying on shared features, which leads to a contradiction between the regression and classification tasks: the classifier suppresses rotation-sensitive features, while the regressor relies on rotation-variant features. To address this contradiction, a Spatial Dual Network (SD-Net) is proposed, which consists of two modules: the Polarization Dual Pyramid Module (PDPM) and the Spatial Coordinate Attention Module (SCAM). In SCAM, an attention module is built with different convolution kernels that slide along the horizontal and vertical directions, so that channel-related features and global spatial features in different directions can be captured. In addition, the polarization function in PDPM splits features into features suited to the classification and regression tasks for use in the network's classifier and regressor, enabling more refined detection. Experimental results on three remote sensing datasets (DOTA, UCAS-AOD, and HRSC2016) demonstrate that the proposed method achieves higher detection performance while maintaining high efficiency.
ISSN: 0255-660X, 0974-3006
DOI: 10.1007/s12524-023-01750-9
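
For illustration, the Spatial Coordinate Attention Module described in the abstract resembles coordinate-style attention: features are pooled and convolved separately along the horizontal and vertical axes to produce direction-aware, channel-related weights. The following is a minimal PyTorch-style sketch of such a block under that assumption; the paper's code is not reproduced here, so the class name, kernel sizes (3x1 and 1x3), and reduction ratio are illustrative choices, not SD-Net's actual implementation.

    # Illustrative sketch only: every name, kernel size, and reduction ratio
    # below is an assumption, not the authors' published code.
    import torch
    import torch.nn as nn

    class SpatialCoordinateAttention(nn.Module):
        """Coordinate-style attention in the spirit of SCAM: features are
        aggregated separately along the horizontal and vertical axes so that
        direction-aware channel weights can be learned."""

        def __init__(self, channels: int, reduction: int = 16):
            super().__init__()
            hidden = max(channels // reduction, 8)
            # Shared bottleneck applied to the concatenated directional descriptors.
            self.bottleneck = nn.Sequential(
                nn.Conv2d(channels, hidden, kernel_size=1),
                nn.BatchNorm2d(hidden),
                nn.ReLU(inplace=True),
            )
            # Convolutions that slide along the vertical (3x1) and
            # horizontal (1x3) directions to produce per-direction attention.
            self.attn_h = nn.Conv2d(hidden, channels, kernel_size=(3, 1), padding=(1, 0))
            self.attn_w = nn.Conv2d(hidden, channels, kernel_size=(1, 3), padding=(0, 1))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            n, c, h, w = x.shape
            # Pool over width -> one descriptor per row, shape (N, C, H, 1).
            pooled_h = x.mean(dim=3, keepdim=True)
            # Pool over height -> one descriptor per column, reshaped to (N, C, W, 1).
            pooled_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)
            # Encode both descriptors jointly, then split them back per direction.
            y = self.bottleneck(torch.cat([pooled_h, pooled_w], dim=2))
            y_h, y_w = torch.split(y, [h, w], dim=2)
            y_w = y_w.permute(0, 1, 3, 2)
            # Direction-aware attention weights, broadcast back onto the feature map.
            a_h = torch.sigmoid(self.attn_h(y_h))   # (N, C, H, 1)
            a_w = torch.sigmoid(self.attn_w(y_w))   # (N, C, 1, W)
            return x * a_h * a_w

    if __name__ == "__main__":
        feat = torch.randn(2, 256, 64, 64)                      # e.g. one pyramid level
        print(SpatialCoordinateAttention(256)(feat).shape)      # torch.Size([2, 256, 64, 64])

Pooling each axis separately keeps positional information along one direction while summarizing the other, which is what allows the resulting weights to reflect object orientation rather than only channel importance.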