HSF-Net: Multiscale Deep Feature Embedding for Ship Detection in Optical Remote Sensing Imagery

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2018-12, Vol. 56 (12), p. 7147-7161
Main authors: Li, Qingpeng, Mou, Lichao, Liu, Qingjie, Wang, Yunhong, Zhu, Xiao Xiang
Format: Article
Language: English
Description
Summary: Ship detection is an important and challenging task in remote sensing applications. Most methods use specially designed hand-crafted features to detect ships and usually work well only at a single scale, which limits their generalization and makes them impractical for identifying ships of various scales in multiresolution images. In this paper, we propose a novel deep feature-based method to detect ships in very high-resolution optical remote sensing images. In our method, a region proposal network generates ship candidates from feature maps produced by a deep convolutional neural network. To efficiently detect ships of various scales, a hierarchical selective filtering layer is proposed to map features at different scales into the same scale space. The proposed method is an end-to-end network that can detect both inshore and offshore ships ranging from dozens to thousands of pixels in size. We test our network on a large ship data set, to be released in the future, consisting of Google Earth images, GaoFen-2 images, and unmanned aerial vehicle data. Experiments demonstrate the high precision and robustness of our method. Further experiments on aerial images show its good generalization to unseen scenes.
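To illustrate the general idea described in the abstract (mapping backbone feature maps of different scales into one shared feature space before region proposals and detection), here is a minimal PyTorch sketch. It is not the paper's hierarchical selective filtering layer: the class name, kernel sizes, channel width, and resizing strategy are illustrative assumptions only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleEmbeddingSketch(nn.Module):
    """Illustrative sketch: project feature maps from several backbone
    stages into a common channel width and spatial size, so a single
    detection head can operate on them. Hyperparameters are assumptions,
    not the published HSF-Net configuration."""

    def __init__(self, in_channels_list, out_channels=256):
        super().__init__()
        # one lightweight 1x1 projection per input scale
        self.projections = nn.ModuleList(
            nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels_list
        )

    def forward(self, feature_maps):
        # resize every projected map to the spatial size of the finest map
        target_size = feature_maps[0].shape[-2:]
        outputs = []
        for proj, fmap in zip(self.projections, feature_maps):
            x = proj(fmap)
            x = F.interpolate(x, size=target_size, mode="nearest")
            outputs.append(x)
        return outputs


if __name__ == "__main__":
    # toy usage: three feature maps with different channel counts and strides
    maps = [torch.randn(1, c, s, s) for c, s in [(256, 64), (512, 32), (1024, 16)]]
    module = MultiScaleEmbeddingSketch([256, 512, 1024])
    fused = module(maps)
    print([f.shape for f in fused])  # all now share 256 channels and a 64x64 grid
```

In this sketch, the shared scale space is simply a fixed channel width plus a common spatial resolution; the paper's layer may differ in how filtering and fusion are performed.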
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2018.2848901