Cross-Attention Spectral-Spatial Network for Hyperspectral Image Classification

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-14
Main authors: Yang, Kai; Sun, Hao; Zou, Chunbo; Lu, Xiaoqiang
Format: Article
Language: English
Description
Abstract: Hyperspectral image (HSI) classification aims to identify the categories of hyperspectral pixels. Recently, many convolutional neural networks (CNNs) have been designed to exploit the spectral and spatial information of HSI for classification. In recent CNN-based methods, 2-D or 3-D convolutions are inevitably used as the basic operations to extract spatial or spectral-spatial features. However, 2-D and 3-D convolutions are sensitive to image rotation, so recent CNN-based methods may not be robust to HSI rotation. In this article, a cross-attention spectral-spatial network (CASSN) is proposed to alleviate the problem of HSI rotation. First, a cross-spectral attention component exploits the local and global spectra of the pixel to generate band weights that suppress redundant bands. Second, a spectral feature extraction component captures spectral features. Then, a cross-spatial attention component generates spectral-spatial features from the HSI patch under the guidance of the pixel to be classified. Finally, the spectral-spatial feature is fed to a softmax classifier to obtain the category. The effectiveness of CASSN is demonstrated on three public databases.
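
To make the pipeline described in the abstract concrete, the following is a minimal PyTorch-style sketch of the forward pass. The module names, layer choices, and dimensions are assumptions made for illustration and do not reproduce the authors' implementation; only the overall flow (cross-spectral band weighting, spectral feature extraction, pixel-guided cross-spatial attention, and a softmax classifier) follows the abstract.

import torch
import torch.nn as nn


class CrossSpectralAttention(nn.Module):
    """Weights the bands of the centre pixel using its local and global spectra."""

    def __init__(self, num_bands: int):
        super().__init__()
        # Two small linear branches: one for the pixel's own (local) spectrum,
        # one for the patch-averaged (global) spectrum.
        self.local_fc = nn.Linear(num_bands, num_bands)
        self.global_fc = nn.Linear(num_bands, num_bands)

    def forward(self, pixel, patch):
        # pixel: (B, C); patch: (B, C, H, W)
        global_spectrum = patch.mean(dim=(2, 3))                      # (B, C)
        weights = torch.sigmoid(self.local_fc(pixel) + self.global_fc(global_spectrum))
        return pixel * weights                                        # redundant bands suppressed


class CrossSpatialAttention(nn.Module):
    """Attends over patch positions, guided by the feature of the pixel to classify."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)    # query from the centre-pixel feature
        self.key = nn.Conv2d(dim, dim, 1)   # keys/values from the patch feature map
        self.value = nn.Conv2d(dim, dim, 1)

    def forward(self, pixel_feat, patch_feat):
        # pixel_feat: (B, D); patch_feat: (B, D, H, W)
        d = patch_feat.shape[1]
        q = self.query(pixel_feat).unsqueeze(1)                       # (B, 1, D)
        k = self.key(patch_feat).flatten(2).transpose(1, 2)           # (B, HW, D)
        v = self.value(patch_feat).flatten(2).transpose(1, 2)         # (B, HW, D)
        attn = torch.softmax(q @ k.transpose(1, 2) / d ** 0.5, dim=-1)  # (B, 1, HW)
        return (attn @ v).squeeze(1)                                  # (B, D)


class CASSNSketch(nn.Module):
    """Illustrative end-to-end flow: band weighting -> spectral features ->
    pixel-guided spatial attention -> softmax classifier."""

    def __init__(self, num_bands: int, num_classes: int, dim: int = 64):
        super().__init__()
        self.spectral_attn = CrossSpectralAttention(num_bands)
        # 1-D convolution over the band axis as a stand-in spectral feature extractor.
        self.spectral_feat = nn.Sequential(
            nn.Conv1d(1, dim, kernel_size=7, padding=3), nn.ReLU(), nn.AdaptiveAvgPool1d(1))
        self.patch_feat = nn.Conv2d(num_bands, dim, kernel_size=3, padding=1)
        self.spatial_attn = CrossSpatialAttention(dim)
        self.classifier = nn.Linear(dim, num_classes)

    def forward(self, pixel, patch):
        weighted = self.spectral_attn(pixel, patch)                   # (B, C)
        spec = self.spectral_feat(weighted.unsqueeze(1)).squeeze(-1)  # (B, dim)
        spat = self.patch_feat(patch)                                 # (B, dim, H, W)
        fused = self.spatial_attn(spec, spat)                         # (B, dim)
        return torch.softmax(self.classifier(fused), dim=-1)          # class probabilities


# Example usage with assumed sizes (200 bands, 16 classes, 11x11 patches):
# model = CASSNSketch(num_bands=200, num_classes=16)
# probs = model(torch.randn(8, 200), torch.randn(8, 200, 11, 11))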
ISSN: 0196-2892; 1558-0644
DOI: 10.1109/TGRS.2021.3133582