Multiscale Superpixel-Based Hyperspectral Image Classification Using Recurrent Neural Networks With Stacked Autoencoders
Published in: | IEEE Transactions on Multimedia, 2020-02, Vol. 22 (2), pp. 487-501 |
Main authors: | , |
Format: | Article |
Language: | English |
Abstract: | This paper develops a novel hyperspectral image (HSI) classification framework that exploits the spectral-spatial features of multiscale superpixels via recurrent neural networks with stacked autoencoders. Superpixels segment an HSI into shape-adaptive regions, and multiscale superpixels capture object information more accurately; superpixel-based classification methods have therefore been studied by many researchers. We propose a multiscale superpixel-based classification method that, in contrast to existing work, not only captures the features at each scale but also models the correlation among scales via recurrent neural networks, so the spectral-spatial information within a superpixel is exploited more efficiently. Specifically, we first segment the HSI from coarse to fine scales using superpixels. Then, the spatial features within each superpixel and among superpixels are exploited through local and nonlocal similarity measures. Finally, recurrent neural networks with stacked autoencoders learn high-level multiscale spectral-spatial features. Experiments on real HSI datasets demonstrate the superiority of the proposed method over several well-known methods in both visual appearance and classification accuracy. |
ISSN: | 1520-9210 1941-0077 |
DOI: | 10.1109/TMM.2019.2928491 |
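The scale-recurrence idea in the abstract can be illustrated with a rough, hypothetical sketch: per-scale features of a pixel (stand-ins for the autoencoder-compressed spectral-spatial features) are fed, coarse to fine, through a plain tanh RNN cell, and the final hidden state is classified. All array sizes, weight initializations, and the simple RNN cell here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def rnn_over_scales(scale_features, W_in, W_h, b):
    """Run a vanilla tanh RNN over the per-scale feature sequence.

    scale_features: (S, D) array, one D-dim feature vector per superpixel scale,
    ordered coarse to fine. Returns the final hidden state of shape (H,).
    """
    h = np.zeros(W_h.shape[0])
    for x in scale_features:                 # one step per scale
        h = np.tanh(W_in @ x + W_h @ h + b)  # hidden state carries cross-scale correlation
    return h

rng = np.random.default_rng(0)
S, D, H, C = 3, 8, 16, 4                     # scales, feature dim, hidden dim, classes
W_in = rng.standard_normal((H, D)) * 0.1
W_h = rng.standard_normal((H, H)) * 0.1
b = np.zeros(H)
W_out = rng.standard_normal((C, H)) * 0.1    # toy linear classifier on the final state

feats = rng.standard_normal((S, D))          # stand-in multiscale features for one pixel
h_final = rnn_over_scales(feats, W_in, W_h, b)
logits = W_out @ h_final
pred = int(np.argmax(logits))                # predicted class index for this pixel
```

Because the same hidden state is updated at every scale, the classifier sees a summary of all scales rather than each scale in isolation, which is the correlation-across-scales point the abstract emphasizes.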