Remote Sensing Image Sharpening by Integrating Multispectral Image Super-Resolution and Convolutional Sparse Representation Fusion


Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 46562-46574
Main Authors: Wu, Honglin; Zhao, Shuzhen; Zhang, Jianming; Lu, Chaoquan
Format: Article
Language: English
Online Access: Full Text
Description
Abstract: In remote sensing, fusing the spectral information of low-resolution multispectral (LRMS) images with the spatial information of panchromatic (PAN) images is essential for obtaining high-resolution multispectral (HRMS) images. This paper proposes an effective fusion method that integrates multispectral (MS) image super-resolution with convolutional sparse representation (CSR) fusion to make full use of the spatial information in remote sensing images. In the first stage, to enhance the spatial information of LRMS images at a suitable size, a fast iterative image super-resolution algorithm based on the learned iterative shrinkage and thresholding algorithm (LISTA) is employed; it uses a feed-forward neural network to simplify the computation of the sparse coefficients during super-resolution. In the fusion stage, a CSR-based image fusion framework is proposed in which each super-resolved MS image and the PAN image are decomposed into a base layer and a detail layer, and the base layers and detail layers are then fused separately. This hierarchical fusion strategy preserves detail effectively. Experimental results on the QuickBird, WorldView-2, and Landsat ETM+ datasets demonstrate that the proposed method outperforms competing methods in both objective evaluation and visual quality.
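
The LISTA stage mentioned in the abstract unrolls the iterative shrinkage and thresholding algorithm into a fixed number of feed-forward layers, so sparse codes are obtained in a single forward pass rather than by running an optimizer to convergence. Below is a minimal sketch of that idea, not the authors' trained network: the function and variable names (`lista_forward`, `W_e`, `S`, `theta`) are hypothetical placeholders, and the random matrices stand in for weights that the paper learns from training data.

```python
import numpy as np

def soft_threshold(x, theta):
    """Element-wise soft shrinkage, the nonlinearity in (L)ISTA."""
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def lista_forward(y, W_e, S, theta, n_layers=3):
    """Feed-forward LISTA pass: each layer is one unrolled ISTA
    iteration z <- shrink(W_e @ y + S @ z, theta). In the paper
    W_e, S, and theta are learned; here they are placeholders."""
    z = soft_threshold(W_e @ y, theta)
    for _ in range(n_layers - 1):
        z = soft_threshold(W_e @ y + S @ z, theta)
    return z

# Toy usage with random (untrained) parameters.
rng = np.random.default_rng(0)
y = rng.standard_normal(64)              # vectorized low-resolution patch
W_e = 0.1 * rng.standard_normal((128, 64))
S = 0.1 * rng.standard_normal((128, 128))
z = lista_forward(y, W_e, S, theta=0.05)
print(z.shape)                           # (128,) sparse code estimate
```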
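The fusion stage splits each image into a smooth base layer and a detail layer and fuses the two layers with separate rules. The sketch below illustrates only that hierarchical structure under simplifying assumptions: a box filter stands in for the decomposition used in the paper, and a max-absolute activity rule stands in for the paper's comparison of CSR coefficients; `fuse_band` and `base_detail_split` are illustrative names, not the authors' API.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def base_detail_split(img, size=31):
    """Low-pass base layer plus residual detail layer. A box filter
    is a simplified stand-in for the paper's decomposition."""
    base = uniform_filter(img.astype(np.float64), size=size)
    return base, img - base

def fuse_band(ms_band, pan):
    """Fuse one super-resolved MS band with the PAN image: average
    the base layers, pick the stronger detail at each pixel
    (a stand-in for comparing CSR coefficient magnitudes)."""
    b_ms, d_ms = base_detail_split(ms_band)
    b_pan, d_pan = base_detail_split(pan)
    base = 0.5 * (b_ms + b_pan)
    detail = np.where(np.abs(d_ms) >= np.abs(d_pan), d_ms, d_pan)
    return base + detail

# Toy usage: fuse each band of a 3-band MS image with a PAN image.
rng = np.random.default_rng(1)
pan = rng.random((256, 256))
ms_up = rng.random((256, 256, 3))        # MS image after super-resolution
hrms = np.stack([fuse_band(ms_up[..., k], pan) for k in range(3)], axis=-1)
print(hrms.shape)                        # (256, 256, 3)
```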
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2908968