Attention-Aware Pseudo-3-D Convolutional Neural Network for Hyperspectral Image Classification
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2021-09, Vol. 59 (9), pp. 7790-7802
Main authors: , , , ,
Format: Article
Language: English
Abstract: Convolutional neural networks (CNNs) have recently been applied to hyperspectral image classification. Among this class of deep models, the 3-D CNN has been shown to be more effective because it learns discriminative features from the abundant spectral signatures and spatial contexts in hyperspectral imagery (HSI). However, simply applying a 3-D CNN to HSI can lose a large amount of the initial information along the CNN pipeline. The proposed attention-aware pseudo-3-D (AP3D) convolutional network for HSI classification is motivated by two observations. First, the dimensions of the 3-D HSI cube are not equally important, so different attention should be paid to different dimensions of the initial HSI, especially in the first convolution operation. Second, the intermediate representations of the 3-D input at different stages of the 3-D CNN pipeline capture different levels of features and should not be neglected or abandoned; instead, a 2-D matrix of scores for each feature map should be fed to the final softmax layer. Quantitative and qualitative results demonstrate that the proposed AP3D model outperforms state-of-the-art HSI classification methods on agricultural and rural/urban data sets: Indian Pines, Pavia University, and Salinas Scene.
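The record does not include implementation details, but the two ideas named in the abstract, dimension-wise attention on the input cube and pseudo-3-D convolution, can be sketched roughly as follows. This is a minimal illustration under assumed design choices, not the authors' AP3D architecture: the factorized 1x3x3 spatial / 3x1x1 spectral convolutions, the three-way softmax gate, and all names such as AP3DBlock are assumptions made for illustration.

```python
# Hedged sketch only: layer sizes, the factorization, and the attention gate
# are assumptions; the paper's actual AP3D design may differ.
import torch
import torch.nn as nn

class AP3DBlock(nn.Module):
    """Pseudo-3-D convolution with a simple dimension-attention gate (illustrative)."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        # Factorized "pseudo-3-D" convolution: spatial (1x3x3) then spectral (3x1x1).
        self.spatial = nn.Conv3d(in_ch, out_ch, kernel_size=(1, 3, 3), padding=(0, 1, 1))
        self.spectral = nn.Conv3d(out_ch, out_ch, kernel_size=(3, 1, 1), padding=(1, 0, 0))
        # Learned weights over the three cube dimensions (bands, height, width);
        # the abstract only says different dimensions deserve different attention,
        # so a 3-way softmax gate is one plausible reading, not the paper's method.
        self.dim_logits = nn.Parameter(torch.zeros(3))
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, bands, height, width)
        w = torch.softmax(self.dim_logits, dim=0)
        # Pool the cube along each dimension and re-weight the input so that
        # more informative dimensions contribute more to the first convolution.
        spec = x.mean(dim=(3, 4), keepdim=True)   # spectral profile
        hgt = x.mean(dim=(2, 4), keepdim=True)    # height profile
        wid = x.mean(dim=(2, 3), keepdim=True)    # width profile
        x = x * (w[0] * spec + w[1] * hgt + w[2] * wid)
        return self.act(self.spectral(self.act(self.spatial(x))))

# Usage on a toy HSI patch: 1 sample, 1 input channel, 30 bands, 9x9 spatial window.
block = AP3DBlock(in_ch=1, out_ch=16)
features = block(torch.randn(1, 1, 30, 9, 9))
print(features.shape)  # torch.Size([1, 16, 30, 9, 9])
```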
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2020.3038212