Multispectral and Hyperspectral Image Fusion Based on Regularized Coupled Non-Negative Block-Term Tensor Decomposition

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), 2022-11, Vol. 14 (21), p. 5306
Authors: Guo, Hao; Bao, Wenxing; Qu, Kewen; Ma, Xuan; Cao, Meng
Format: Article
Language: English
Online access: Full text
Description
Abstract: The problem of multispectral and hyperspectral image fusion (MHF) is to reconstruct images by fusing the spatial information of multispectral images with the spectral information of hyperspectral images. Because the canonical polyadic decomposition and Tucker models used for hyperspectral fusion cannot incorporate a physical interpretation of the latent factors into the framework, it is difficult to exploit the known properties of endmembers and abundances to generate high-quality fused images. This paper therefore proposes a new fusion algorithm. A coupled non-negative block-term tensor decomposition model is used to estimate the ideal high-spatial-resolution hyperspectral image; its sparsity is characterized by an ℓ1-norm penalty, and total variation (TV) is introduced to describe piecewise smoothness. Difference operators in two directions are then defined and introduced to characterize this piecewise smoothness. Finally, the proximal alternating optimization (PAO) algorithm and the alternating direction method of multipliers (ADMM) are used to solve the model iteratively. Experiments on two standard datasets and two local datasets show that the proposed method outperforms state-of-the-art methods.
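The abstract does not state the exact objective; as a rough sketch with assumed notation (the factor matrices A_r, B_r, spectral signatures c_r, degradation operators P_1, P_2, P_3, difference operators D_x, D_y, and weights λ, μ are illustrative, not taken from the paper), a regularized coupled non-negative block-term model of this kind can be written as

\[
\min_{\{\mathbf{A}_r,\mathbf{B}_r,\mathbf{c}_r\}\ge 0}\;
\Big\|\mathcal{Y}_h-\sum_{r=1}^{R}\big((\mathbf{P}_1\mathbf{A}_r)(\mathbf{P}_2\mathbf{B}_r)^{\mathsf T}\big)\circ\mathbf{c}_r\Big\|_F^2
+\Big\|\mathcal{Y}_m-\sum_{r=1}^{R}(\mathbf{A}_r\mathbf{B}_r^{\mathsf T})\circ(\mathbf{P}_3\mathbf{c}_r)\Big\|_F^2
+\lambda\sum_{r=1}^{R}\|\mathbf{S}_r\|_1
+\mu\sum_{r=1}^{R}\big(\|\mathbf{D}_x\mathbf{S}_r\|_1+\|\mathbf{D}_y\mathbf{S}_r\|_1\big),
\qquad \mathbf{S}_r=\mathbf{A}_r\mathbf{B}_r^{\mathsf T},
\]

where \(\mathcal{Y}_h\) and \(\mathcal{Y}_m\) are the observed hyperspectral and multispectral images, the target high-resolution image is reconstructed as \(\mathcal{Z}=\sum_{r=1}^{R}\mathbf{S}_r\circ\mathbf{c}_r\) (a rank-\((L_r,L_r,1)\) block-term decomposition), \(\mathbf{P}_1,\mathbf{P}_2\) model spatial blurring and downsampling, \(\mathbf{P}_3\) is the spectral response, and the last two terms stand in for the ℓ1 sparsity and two-direction TV penalties mentioned above; the exact variables the paper regularizes and the splitting used by PAO/ADMM may differ.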
ISSN: 2072-4292
DOI: 10.3390/rs14215306