Multiscale Spatial Fusion and Regularization Induced Unsupervised Auxiliary Task CNN Model for Deep Super-Resolution of Hyperspectral Images
Published in: | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2022, Vol. 15, p. 4583-4598 |
---|---|
Main Authors: | , , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Full text |
Abstract: | Hyperspectral images (HSI) feature rich spectral information across many narrow bands, but at the cost of a relatively low spatial resolution. As such, various methods have been developed for enhancing the spatial resolution of a low-resolution HSI (Lr-HSI) by fusing it with a high-resolution multispectral image (Hr-MSI). The difference in spectral range and spatial dimensions between the Lr-HSI and the Hr-MSI remains a fundamental challenge for multispectral/hyperspectral (MS/HS) fusion. In this article, a convolutional neural network model with multiscale spatial fusion and a regularization-induced auxiliary task is proposed for deep super-resolution of HSI, in which an Lr-HSI is fused with an Hr-MSI to reconstruct a high-resolution HSI (Hr-HSI) counterpart. The multiscale fusion efficiently addresses the discrepancy in spatial resolution between the two inputs. Based on the general assumption that the acquired Hr-MSI and the reconstructed Hr-HSI share similar underlying characteristics, the auxiliary task learns a shared representation that improves the generality of the model and reduces overfitting. Experimental results on five public datasets validate the effectiveness of our approach in comparison with several state-of-the-art methods. |
---|---|
ISSN: | 1939-1404, 2151-1535 |
DOI: | 10.1109/JSTARS.2022.3176969 |
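
The abstract describes the model only at a high level. The PyTorch snippet below is a minimal, self-contained sketch of the general idea it outlines: an Lr-HSI and an Hr-MSI are fused at several spatial scales to predict the Hr-HSI, while an auxiliary head reconstructs the Hr-MSI from the same shared features as a regularizer. The layer choices, class and parameter names, and loss weighting are illustrative assumptions, not the architecture proposed in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiscaleFusionSR(nn.Module):
    """Illustrative MS/HS fusion CNN with an auxiliary MSI-reconstruction head."""

    def __init__(self, hsi_bands=31, msi_bands=3, feats=64, n_scales=3):
        super().__init__()
        # Separate shallow feature extractors for the two inputs.
        self.hsi_head = nn.Conv2d(hsi_bands, feats, 3, padding=1)
        self.msi_head = nn.Conv2d(msi_bands, feats, 3, padding=1)
        # Fusion blocks applied at progressively finer spatial scales.
        self.fuse_blocks = nn.ModuleList([
            nn.Sequential(nn.Conv2d(2 * feats, feats, 3, padding=1),
                          nn.ReLU(inplace=True))
            for _ in range(n_scales)
        ])
        # Main head: predict the Hr-HSI from the fused features.
        self.hsi_out = nn.Conv2d(feats, hsi_bands, 3, padding=1)
        # Auxiliary head: reconstruct the Hr-MSI from the same shared features.
        self.msi_out = nn.Conv2d(feats, msi_bands, 3, padding=1)

    def forward(self, lr_hsi, hr_msi):
        hr_h, hr_w = hr_msi.shape[-2:]
        x = self.hsi_head(lr_hsi)          # spectral features, low spatial resolution
        m_full = self.msi_head(hr_msi)     # spatial features, full resolution
        n = len(self.fuse_blocks)
        for i, block in enumerate(self.fuse_blocks):
            # Fuse at 1/2^(n-1-i) of the Hr resolution, ending at full resolution.
            s = 2 ** (n - 1 - i)
            size = (hr_h // s, hr_w // s)
            x = F.interpolate(x, size=size, mode="bilinear", align_corners=False)
            m = F.interpolate(m_full, size=size, mode="bilinear", align_corners=False)
            x = block(torch.cat([x, m], dim=1))
        return self.hsi_out(x), self.msi_out(x)


if __name__ == "__main__":
    lr_hsi = torch.randn(1, 31, 16, 16)    # Lr-HSI: many bands, small spatial size
    hr_msi = torch.randn(1, 3, 64, 64)     # Hr-MSI: few bands, large spatial size
    hr_hsi, aux_msi = MultiscaleFusionSR()(lr_hsi, hr_msi)
    # Training would combine the main and auxiliary reconstruction terms, e.g.
    # loss = F.l1_loss(hr_hsi, gt_hsi) + 0.1 * F.l1_loss(aux_msi, hr_msi)
    print(hr_hsi.shape, aux_msi.shape)     # (1, 31, 64, 64) and (1, 3, 64, 64)
```

The auxiliary branch shares all fusion features with the main branch and adds only one output convolution, so it acts purely as a training-time regularizer in this sketch and can be dropped at inference.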