Attention and Feature Fusion SSD for Remote Sensing Object Detection
Published in: IEEE Transactions on Instrumentation and Measurement, 2021, Vol. 70, pp. 1-9
Format: Article
Language: English
Online access: Order full text
Abstract: Object detection is a fundamental task in remote sensing. However, remote sensing images usually contain complicated backgrounds, large object scale variations, and small objects, which keep remote sensing object detection challenging. The existing two-stage methods achieve higher accuracy, but their detection speed is slow, whereas one-stage detectors such as the single-shot detector (SSD) and YOLO run in real time but detect poorly, especially on small objects. To further improve the remote sensing object detection performance of one-stage methods, we propose an end-to-end network named attention and feature fusion SSD. First, a multilayer feature fusion structure is designed to enhance the semantic information of the shallow features. Next, a dual-path attention module is introduced to screen the feature information; it uses spatial attention and channel attention to suppress background noise and highlight the key features. Then, the feature representation ability of the network is further enhanced by a multiscale receptive field module. Finally, the loss function is optimized to alleviate the imbalance between positive and negative samples. The experimental results on the DOTA and NWPU VHR-10 data sets verify the effectiveness of our method.
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2021.3052575
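
The dual-path attention module described in the abstract reweights SSD feature maps with channel attention and spatial attention. Below is a minimal PyTorch sketch of such a dual-path (channel + spatial) attention block; the class names, reduction ratio, and kernel size are illustrative assumptions in the general CBAM style, not the authors' released implementation.

```python
# Hypothetical sketch of a dual-path (channel + spatial) attention block.
# Names and hyperparameters are illustrative, not the paper's actual code.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze spatial dimensions, then reweight channels with a bottleneck MLP."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling path
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling path
        weights = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * weights


class SpatialAttention(nn.Module):
    """Reweight spatial positions using channel-pooled statistics."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)    # average over channels
        mx = x.amax(dim=1, keepdim=True)     # max over channels
        weights = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * weights


class DualPathAttention(nn.Module):
    """Channel attention followed by spatial attention on one feature map."""
    def __init__(self, channels: int):
        super().__init__()
        self.channel_att = ChannelAttention(channels)
        self.spatial_att = SpatialAttention()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.spatial_att(self.channel_att(x))


if __name__ == "__main__":
    feat = torch.randn(2, 256, 38, 38)       # e.g. a shallow SSD feature map
    out = DualPathAttention(256)(feat)
    print(out.shape)                          # torch.Size([2, 256, 38, 38])
```

In this sketch the attended feature map keeps its original shape, so the block can be dropped between an SSD backbone stage and its detection head without changing the surrounding architecture.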