FDLR-Net: A feature decoupling and localization refinement network for object detection in remote sensing images
Published in: | Expert systems with applications, 2023-09, Vol. 225, p. 120068, Article 120068 |
Main Authors: | , , , , , |
Format: | Article |
Language: | English |
Online Access: | Full text |
Abstract: | Object detection in remote sensing images is a critical task in computer vision. Objects in remote sensing images often vary widely in scale and appear at arbitrary orientations, which makes spatial alignment between anchor boxes and objects difficult. In this paper, a feature decoupling and localization refinement network (FDLR-Net) is proposed to address this issue. Specifically, a bidirectional feature fusion module (BFFM) is devised to construct a multi-scale feature pyramid for detecting objects at different scales. A feature decoupling module (FDM) fuses spatial attention and channel attention and applies different attention functions to generate features specifically tuned for regression and classification, guiding more accurate localization and classification. Further, a localization refinement module (LRM) is designed to automatically optimize the anchor box parameters so that each anchor box is spatially aligned with the object regression feature. The FDM and LRM are cascaded to achieve more accurate localization. Experimental results on two open-access datasets, DOTA and HRSC2016, show that FDLR-Net achieves state-of-the-art performance, with mAP reaching 73.08% and 89.4%, respectively. |
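The abstract only sketches the modules at a high level. As a rough illustration of the feature-decoupling idea, the PyTorch sketch below fuses channel attention with spatial attention and then applies separate attention gates to emit regression- and classification-specific feature maps. All concrete layer choices here (squeeze-and-excitation channel attention, CBAM-style spatial attention, 3x3 sigmoid gates, 256 channels) are assumptions for illustration, not the paper's actual FDM design.

```python
# Minimal sketch of an FDM-style feature-decoupling head. Layer choices are
# assumptions; the paper's exact architecture is not given in the abstract.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed design)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(x)


class SpatialAttention(nn.Module):
    """Spatial attention over pooled channel statistics (assumed design)."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg_map = x.mean(dim=1, keepdim=True)
        max_map, _ = x.max(dim=1, keepdim=True)
        attn = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn


class FeatureDecouplingModule(nn.Module):
    """Fuses channel and spatial attention, then emits two task-specific
    feature maps: one for box regression, one for classification."""
    def __init__(self, channels: int = 256):
        super().__init__()
        self.channel_attn = ChannelAttention(channels)
        self.spatial_attn = SpatialAttention()
        # Separate attention functions per task, as the abstract suggests;
        # a 3x3 conv with sigmoid gating is a placeholder choice.
        self.reg_gate = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.Sigmoid())
        self.cls_gate = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.Sigmoid())

    def forward(self, x: torch.Tensor):
        fused = self.spatial_attn(self.channel_attn(x))
        reg_feat = fused * self.reg_gate(fused)   # localization branch
        cls_feat = fused * self.cls_gate(fused)   # classification branch
        return reg_feat, cls_feat


if __name__ == "__main__":
    fdm = FeatureDecouplingModule(256)
    reg, cls = fdm(torch.randn(1, 256, 64, 64))
    print(reg.shape, cls.shape)  # both torch.Size([1, 256, 64, 64])
```

Similarly, the anchor refinement attributed to the LRM can be pictured as the standard delta-based anchor update many detectors use: a learned offset (dx, dy, dw, dh) warps each anchor toward the object before the final regression. Since FDLR-Net targets arbitrarily oriented objects, its actual parameterization presumably also includes an angle term, which this axis-aligned sketch omits.

```python
# Hedged sketch of one refinement step; not the paper's exact rule.
import torch


def refine_anchors(anchors: torch.Tensor, deltas: torch.Tensor) -> torch.Tensor:
    """anchors, deltas: (N, 4) as (cx, cy, w, h) and (dx, dy, dw, dh)."""
    cx = anchors[:, 0] + deltas[:, 0] * anchors[:, 2]
    cy = anchors[:, 1] + deltas[:, 1] * anchors[:, 3]
    w = anchors[:, 2] * torch.exp(deltas[:, 2])
    h = anchors[:, 3] * torch.exp(deltas[:, 3])
    return torch.stack([cx, cy, w, h], dim=1)
```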
ISSN: | 0957-4174, 1873-6793 |
DOI: | 10.1016/j.eswa.2023.120068 |