PAN-Guided Cross-Resolution Projection for Local Adaptive Sparse Representation-Based Pansharpening
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2019-07, Vol. 57 (7), p. 4938-4950
Author:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Sparse representation (SR)-based methods solve pansharpening as an image superresolution problem and have gained great popularity. Conventional approaches assume that the high- and low-resolution images share the same sparse coefficients. However, this identity mapping is not universal and limits performance. To overcome this limitation, this paper proposes PAN-guided cross-resolution projection-based pansharpening (PGCP-PS), which incorporates SR image superresolution and the detail-injection pansharpening scheme into a single framework. The basic idea of PGCP-PS is to inject a possible offset into the part reconstructed by SR superresolution. In addition, the assumption of identical sparse coefficients across resolutions is relaxed to one of identical sparse support together with a local adaptive cross-resolution projection. By exploiting the similarity between the panchromatic (PAN) and multispectral (MS) images, the cross-resolution projection and offset for sharpening the MS image are estimated from a simulated PAN image superresolution scenario. The high- and low-resolution dictionaries used in the SR image superresolution stage are learned from the PAN image and its degraded version. A series of experimental results on reduced-scale and full-scale data sets demonstrates that PGCP-PS outperforms several advanced methods and existing SR-based methods.
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2019.2894702
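
The abstract describes the key departure from conventional SR-based pansharpening: instead of assuming identical sparse coefficients at both resolutions, only the sparse support is shared, the coefficients are mapped by a local adaptive cross-resolution projection, and an offset is injected into the SR-reconstructed part. The sketch below illustrates one plausible reading of that reconstruction step in plain NumPy. The dictionaries `D_lr`/`D_hr`, the projection `P`, the `offset`, and the simple OMP sparse coder are assumptions made for illustration only (the paper estimates the projection and offset from a simulated PAN superresolution scenario and learns the dictionaries from the PAN image and its degraded version); this is not the authors' implementation.

```python
import numpy as np


def omp(D, y, k):
    """Greedy orthogonal matching pursuit: sparse-code y over dictionary D
    using at most k atoms. Returns the support and the full coefficient vector."""
    residual = y.astype(float).copy()
    support, coef = [], np.zeros(D.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))   # atom most correlated with residual
        if j not in support:
            support.append(j)
        sub, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        coef[:] = 0.0
        coef[support] = sub                          # refit coefficients on current support
        residual = y - D @ coef
    return support, coef


def reconstruct_hr_patch(y_lr, D_lr, D_hr, P, offset, k=4):
    """Reconstruct one high-resolution MS patch from its low-resolution patch.

    Conventional SR pansharpening keeps the low-resolution coefficients
    unchanged (identity mapping): x_hr = D_hr @ coef.  Here only the sparse
    support is kept; the coefficients on that support are mapped by a local
    projection P, and an additive offset is injected into the reconstruction.
    P and offset are hypothetical placeholders for quantities the paper
    estimates from a simulated PAN superresolution scenario."""
    support, coef = omp(D_lr, y_lr, k)
    coef_hr = P[np.ix_(support, support)] @ coef[support]  # local cross-resolution projection
    return D_hr[:, support] @ coef_hr + offset              # offset-injected reconstruction


# Toy usage with random data; in the paper the dictionaries are learned from
# the PAN image and its spatially degraded version.
rng = np.random.default_rng(0)
D_lr = rng.standard_normal((16, 64)); D_lr /= np.linalg.norm(D_lr, axis=0)
D_hr = rng.standard_normal((64, 64)); D_hr /= np.linalg.norm(D_hr, axis=0)
P = np.eye(64)           # identity projection recovers the conventional assumption
offset = np.zeros(64)    # zero offset recovers plain SR reconstruction
x_hr = reconstruct_hr_patch(rng.standard_normal(16), D_lr, D_hr, P, offset)
print(x_hr.shape)        # (64,) -> one reconstructed 8x8 high-resolution patch
```

Setting `P` to the identity and the offset to zero recovers the conventional same-coefficients SR reconstruction, which is exactly the assumption the abstract argues is too restrictive.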