An efficient hyperspectral image classification method for limited training data


Bibliographic Details
Published in: IET Image Processing 2023-05, Vol.17 (6), p.1709-1717
Main Authors: Ren, Yitao, Jin, Peiyang, Li, Yiyang, Mao, Keming
Format: Article
Language: English
Online Access: Full text
Description
Summary: Hyperspectral image classification has made great progress in recent years thanks to deep learning models and massive training data. However, labelling hyperspectral image data and deploying models in constrained environments is expensive and impractical. To address this problem, this paper proposes an effective ghost-module-based spectral network for hyperspectral image classification. First, a Ghost3D module is adopted to dramatically reduce the number of model parameters by generating redundant feature maps with linear transformations. Then a Ghost2D module with channel-wise attention is used to explore informative spectral feature representations. Finally, to cover a large receptive field, a non-local operation is utilised to promote self-attention. Compared with state-of-the-art hyperspectral image classification methods, the proposed approach achieves superior performance on three hyperspectral image data sets with fewer labelled samples and less resource consumption.
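
The abstract gives no implementation details, but the Ghost idea it builds on is well known: a small "primary" convolution produces a few intrinsic feature maps, and cheap linear (depthwise) transformations generate the remaining "ghost" maps, so far fewer parameters are trained. Below is a minimal PyTorch sketch of such a Ghost3D block, assuming the standard GhostNet recipe; the class name, layer choices, and the ratio parameter are illustrative assumptions, not the authors' code.

# Minimal sketch (not the authors' implementation) of a Ghost-style 3-D
# convolution block: a costly primary convolution yields a few intrinsic
# feature maps, and a cheap depthwise convolution generates the remaining
# "ghost" maps, roughly dividing the parameter count by `ratio`.
import torch
import torch.nn as nn


class Ghost3D(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3, ratio=2, cheap_kernel=3):
        super().__init__()
        init_ch = out_ch // ratio      # intrinsic maps from the costly conv
        ghost_ch = out_ch - init_ch    # maps produced by the cheap linear op
        self.primary = nn.Sequential(
            nn.Conv3d(in_ch, init_ch, kernel_size,
                      padding=kernel_size // 2, bias=False),
            nn.BatchNorm3d(init_ch),
            nn.ReLU(inplace=True),
        )
        # Depthwise 3-D convolution acts as the cheap linear transformation.
        self.cheap = nn.Sequential(
            nn.Conv3d(init_ch, ghost_ch, cheap_kernel,
                      padding=cheap_kernel // 2, groups=init_ch, bias=False),
            nn.BatchNorm3d(ghost_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        intrinsic = self.primary(x)
        ghost = self.cheap(intrinsic)
        return torch.cat([intrinsic, ghost], dim=1)


# Toy usage on a hyperspectral patch shaped (batch, 1, bands, height, width).
if __name__ == "__main__":
    x = torch.randn(2, 1, 30, 9, 9)
    print(Ghost3D(1, 16)(x).shape)   # -> torch.Size([2, 16, 30, 9, 9])
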
ISSN: 1751-9659
1751-9667
DOI: 10.1049/ipr2.12749