Hyperspectral image classification with deep 3D capsule network and Markov random field


Bibliographic Details
Published in: IET Image Processing, 2022-01, Vol. 16 (1), pp. 79-91
Main authors: Tan, Xiong; Xue, Zhixiang; Yu, Xuchu; Sun, Yifan; Gao, Kuiliang
Format: Article
Language: English
Online access: Full text
Description
Abstract: To address the existing problems of capsule networks in deep feature extraction and spatial-spectral feature fusion of hyperspectral images, this paper proposes a hyperspectral image classification method that combines a deep residual 3D capsule network with a Markov random field. In this method, deep spatial-spectral features of the hyperspectral image are extracted by a deep residual 3D convolutional structure; the initial capsule layer converts these features into vector capsules, which are mapped into probability capsules via a 3D dynamic routing mechanism to construct the classification probability map; and the spatial structure of the classification result is regularised by a Markov random field to further improve classification accuracy. Two benchmark hyperspectral images, the Indian Pines and Pavia University data sets, were used for comparative experiments and an ablation study. The experimental results show that, compared with conventional convolutional neural networks and existing capsule network models, the proposed method not only improves classification accuracy but also partly eliminates category noise and yields a more regular classification probability map.
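
The abstract describes a pipeline of residual 3D convolutions, an initial capsule layer with 3D dynamic routing, and Markov random field post-processing. Below is a minimal sketch (assuming PyTorch) of the two learnable ingredients: a residual 3D convolution block for spatial-spectral features and routing by agreement from primary capsules to class capsules whose lengths act as class probabilities. All layer widths, capsule dimensions, and helper names (squash, dynamic_routing, Residual3DBlock) are illustrative assumptions rather than the authors' exact architecture, and the MRF regularisation of the resulting probability map is a separate post-processing step not shown here.

import torch
import torch.nn as nn
import torch.nn.functional as F

def squash(s, dim=-1, eps=1e-8):
    # Capsule non-linearity: output length lies in (0, 1) and encodes probability.
    n2 = (s ** 2).sum(dim=dim, keepdim=True)
    return (n2 / (1.0 + n2)) * s / torch.sqrt(n2 + eps)

def dynamic_routing(u_hat, iters=3):
    # Routing by agreement. u_hat: (batch, n_in_caps, n_out_caps, out_dim)
    # prediction vectors; returns class capsules of shape (batch, n_out_caps, out_dim).
    b = torch.zeros(u_hat.shape[:3], device=u_hat.device)   # routing logits
    for _ in range(iters):
        c = F.softmax(b, dim=2)                              # coupling coefficients
        s = (c.unsqueeze(-1) * u_hat).sum(dim=1)             # weighted sum of predictions
        v = squash(s)
        b = b + (u_hat * v.unsqueeze(1)).sum(dim=-1)         # agreement update
    return v

class Residual3DBlock(nn.Module):
    # Two 3D convolutions over (bands, height, width) with an identity skip connection.
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv3d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm3d(channels)
        self.conv2 = nn.Conv3d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm3d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)

# Toy usage: two 7x7 spatial patches with 30 spectral bands and 8 feature channels.
x = torch.randn(2, 8, 30, 7, 7)                 # (batch, channels, bands, H, W)
feat = Residual3DBlock(8)(x)                    # deep spatial-spectral features
# Assume a learned transform has already produced prediction vectors u_hat
# for 16 classes (class-capsule dimension 16) from the primary capsules:
u_hat = torch.randn(2, 30 * 7 * 7, 16, 16)
class_caps = dynamic_routing(u_hat)
class_probs = class_caps.norm(dim=-1)           # capsule lengths ~ per-class probabilities
print(class_probs.shape)                        # torch.Size([2, 16])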
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12330