Multi-Path Interactive Network for Aircraft Identification with Optical and SAR Images

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), 2022-08, Vol. 14 (16), p. 3922
Main authors: Gao, Quanwei; Feng, Zhixi; Yang, Shuyuan; Chang, Zhihao; Wang, Ruyu
Format: Article
Language: English
Online access: Full text
Description
Abstract: Aircraft identification has been a research hotspot in remote-sensing fields. However, due to the presence of clouds in satellite-borne optical imagery, it is difficult to identify aircraft using a single optical image. In this paper, a Multi-path Interactive Network (MIN) is proposed to fuse Optical and Synthetic Aperture Radar (SAR) images for aircraft identification on cloudy days. First, features are extracted from the optical and SAR images separately by ResNet-34 convolution backbones. Second, a piecewise residual fusion strategy is proposed to reduce the effect of clouds. A plug-and-play Interactive Attention Sum-Max fusion module (IASM) is thus constructed to enable interaction between features from the multi-modal images. Moreover, a multi-path IASM is designed to mix multi-modal features from the backbones. Finally, the fused features are sent to the neck and head of MIN for regression and classification. Extensive experiments are carried out on the constructed Fused Cloudy Aircraft Detection (FCAD) dataset, and the results show the efficiency of MIN in identifying aircraft under clouds of different thicknesses. Compared with single-source models, the multi-source fusion model MIN improves performance by more than 20%, and the proposed method outperforms state-of-the-art approaches.
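The pipeline described in the abstract (two ResNet-34 backbones for the optical and SAR inputs, cross-modal IASM fusion, then a detection neck and head) can be illustrated with a minimal PyTorch sketch. The internals of IASM and the piecewise residual fusion are not specified in the abstract, so the channel-attention gating and Sum-Max combination below are assumptions for illustration only; the class names `IASM` and `MINSketch` are hypothetical, and the detection neck and head are omitted.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet34


class IASM(nn.Module):
    """Illustrative Interactive Attention Sum-Max fusion module (internals assumed)."""

    def __init__(self, channels):
        super().__init__()
        # Channel attention for each modality (a simple squeeze-and-excite-style gate).
        self.att_opt = nn.Sequential(nn.AdaptiveAvgPool2d(1),
                                     nn.Conv2d(channels, channels, 1), nn.Sigmoid())
        self.att_sar = nn.Sequential(nn.AdaptiveAvgPool2d(1),
                                     nn.Conv2d(channels, channels, 1), nn.Sigmoid())

    def forward(self, f_opt, f_sar):
        # Cross-modal interaction: each modality's attention re-weights the other.
        opt_g = f_opt * self.att_sar(f_sar)
        sar_g = f_sar * self.att_opt(f_opt)
        # Combine the interacted features by element-wise sum and max.
        return opt_g + sar_g + torch.maximum(opt_g, sar_g)


class MINSketch(nn.Module):
    """Two ResNet-34 backbones (optical / SAR) fused at one stage by IASM.

    The paper uses multi-path IASM across several backbone stages and attaches
    a detection neck and head; both are omitted in this sketch.
    """

    def __init__(self):
        super().__init__()
        opt, sar = resnet34(weights=None), resnet34(weights=None)
        self.opt_stem = nn.Sequential(opt.conv1, opt.bn1, opt.relu, opt.maxpool,
                                      opt.layer1, opt.layer2)
        self.sar_stem = nn.Sequential(sar.conv1, sar.bn1, sar.relu, sar.maxpool,
                                      sar.layer1, sar.layer2)
        self.fuse = IASM(channels=128)  # ResNet-34 layer2 outputs 128 channels

    def forward(self, x_opt, x_sar):
        return self.fuse(self.opt_stem(x_opt), self.sar_stem(x_sar))


if __name__ == "__main__":
    model = MINSketch()
    fused = model(torch.randn(1, 3, 256, 256),   # optical patch
                  torch.randn(1, 3, 256, 256))   # SAR patch (3-channel here for simplicity)
    print(fused.shape)  # torch.Size([1, 128, 32, 32])
```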
ISSN: 2072-4292
DOI: 10.3390/rs14163922