Aggregating Features From Dual Paths for Remote Sensing Image Scene Classification

Published in: IEEE Access, 2022, Vol. 10, pp. 16740-16755
Authors: Yu, Donghang; Xu, Qing; Guo, Haitao; Lu, Jun; Lin, Yuzhun; Liu, Xiangyun
Format: Article
Language: English
Abstract: Scene classification is an important and challenging task in understanding remote sensing images. Convolutional neural networks have been widely applied to remote sensing scene classification in recent years, boosting classification accuracy. However, as image resolution has improved, the categories of remote sensing images have become ever more fine-grained. High intraclass diversity and interclass similarity are the main characteristics that distinguish remote sensing scene classification from natural image classification. To extract discriminative representations from images, we propose an end-to-end feature fusion method that aggregates features from dual paths (AFDP). First, lightweight convolutional neural networks with few parameters and low computational cost are used to construct a feature extractor with dual branches. Then, in the feature fusion stage, a novel fusion method that integrates the concepts of bilinear pooling and feature connection is adopted to learn discriminative features from images. The AFDP method was evaluated on three public remote sensing image benchmarks. The experimental results indicate that AFDP outperforms current state-of-the-art methods while remaining simple in form, highly versatile, and economical in parameters and computation.
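
The dual-path fusion described in the abstract can be sketched in PyTorch. This is a minimal illustration under stated assumptions, not the authors' AFDP implementation: the MobileNetV2 backbones, the 1x1 channel-reduction convolutions, the signed-square-root normalization, and the reading of "feature connection" as channel-wise concatenation are all choices made here for the sake of a short, runnable example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class DualPathFusion(nn.Module):
    """Sketch of dual-path feature fusion: two lightweight CNN branches
    whose features are combined by bilinear pooling plus concatenation.
    Backbone choice and dimensions are assumptions, not the paper's spec."""

    def __init__(self, num_classes=30, reduced=128):
        super().__init__()
        # Two lightweight feature-extraction paths (the abstract only says
        # "lightweight convolutional neural networks"; MobileNetV2 is a
        # hypothetical stand-in). Each outputs (B, 1280, h, w).
        self.path_a = models.mobilenet_v2(weights=None).features
        self.path_b = models.mobilenet_v2(weights=None).features
        # 1x1 convolutions shrink the channel count so the bilinear
        # feature stays a manageable size (reduced * reduced dimensions).
        self.reduce_a = nn.Conv2d(1280, reduced, kernel_size=1)
        self.reduce_b = nn.Conv2d(1280, reduced, kernel_size=1)
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Classifier sees the bilinear vector plus both pooled branch vectors.
        self.classifier = nn.Linear(reduced * reduced + 2 * reduced, num_classes)

    def forward(self, x):
        fa = self.reduce_a(self.path_a(x))  # (B, r, h, w)
        fb = self.reduce_b(self.path_b(x))  # (B, r, h, w)
        b, r, h, w = fa.shape
        # Bilinear pooling: average of outer products over spatial positions.
        bilinear = torch.bmm(
            fa.view(b, r, h * w),
            fb.view(b, r, h * w).transpose(1, 2),
        ) / (h * w)                          # (B, r, r)
        bilinear = bilinear.view(b, -1)
        # Signed square root and L2 normalization, standard post-processing
        # for bilinear features.
        bilinear = torch.sign(bilinear) * torch.sqrt(bilinear.abs() + 1e-8)
        bilinear = F.normalize(bilinear)
        # "Feature connection", interpreted here as concatenating the
        # globally pooled features of both paths with the bilinear vector.
        ga = self.pool(fa).flatten(1)        # (B, r)
        gb = self.pool(fb).flatten(1)        # (B, r)
        return self.classifier(torch.cat([bilinear, ga, gb], dim=1))


model = DualPathFusion(num_classes=30)
logits = model(torch.randn(2, 3, 224, 224))  # -> shape (2, 30)
```

The 1x1 reductions keep the bilinear vector at 128 x 128 = 16,384 dimensions; without them, two 1280-channel feature maps would yield a roughly 1.6M-dimensional bilinear feature, which motivates some form of compression in practice.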
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3147543