Hyperspectral Image Denoising Via Robust Subspace Estimation and Group Sparsity Constraint
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023-01, Vol. 61, p. 1-1
Main Authors: , , ,
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Hyperspectral cameras capture electromagnetic information within hundreds of narrow spectral bands, producing hyperspectral images (HSIs) that can accurately characterize the attribute information of objects. However, mixed noise induced by instrument and atmospheric effects hinders the interpretation and application of HSIs. In this paper, we propose a novel subspace-representation-based mixed-noise removal method for hyperspectral images via Robust Subspace Estimation and a weighted Group Sparsity constraint (RoSEGS). An outlier detection method is proposed to effectively detect sparse noise and replace it with new estimates. A subspace estimation strategy that is robust to mixed noise is proposed: the subspace is first estimated after sparse-noise detection and then refined iteratively. In addition to introducing a state-of-the-art plug-and-play denoiser that exploits the self-similarity of the eigen-images, we impose a weighted group-sparse regularization on the eigen-images to promote the group sparsity of their spatial differences, which further improves denoising performance. Extensive experiments on two simulated and two real HSIs demonstrate the effectiveness of the proposed method in comparison with seven state-of-the-art competitors. (An illustrative sketch of the underlying subspace model follows this record.)
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2023.3277832
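
For readers unfamiliar with subspace-based HSI denoising, the following is a minimal sketch of the generic model the abstract refers to, not the authors' RoSEGS implementation: the noisy image Y (bands × pixels) is modeled as Y ≈ EZ + S + N, where E is a low-dimensional spectral basis, Z are the eigen-images, S is sparse noise, and N is Gaussian noise. The rank k, the MAD threshold tau, and the function name below are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch of the generic subspace model Y ≈ E Z + S + N used
# in HSI denoising (NOT the authors' RoSEGS code). A rank-k spectral
# subspace is estimated by SVD, and entries with unusually large
# residuals are flagged as sparse noise via a robust (MAD) threshold.
import numpy as np

def detect_sparse_noise(Y, k=8, tau=3.0):
    """Flag entries whose residual from a rank-k spectral subspace is
    unusually large (a simple stand-in for the paper's outlier detector)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    E = U[:, :k]                  # estimated spectral subspace basis
    Z = E.T @ Y                   # eigen-images (k x pixels)
    R = Y - E @ Z                 # residual: sparse + Gaussian noise
    mad = np.median(np.abs(R - np.median(R)))
    mask = np.abs(R) > tau * 1.4826 * mad   # robust z-score threshold
    return E, Z, mask

# Toy usage: a 100-band image with 10,000 pixels, rank-5 clean signal,
# Gaussian noise, and roughly 1% impulse (sparse) noise.
rng = np.random.default_rng(0)
clean = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 10000))
Y = clean + 0.05 * rng.standard_normal(clean.shape)
impulses = rng.random(Y.shape) < 0.01
Y[impulses] += rng.choice([-5.0, 5.0], size=impulses.sum())

E, Z, mask = detect_sparse_noise(Y)
print(f"flagged {mask.mean():.2%} of entries as sparse noise")
```

Per the abstract, RoSEGS goes further than this single pass: detected sparse-noise entries are replaced with new estimates, the subspace is re-estimated and refined iteratively, and the eigen-images are regularized with a plug-and-play denoiser plus a weighted group-sparsity term. The SVD-plus-robust-threshold step above only illustrates what a first pass of such a pipeline might look like.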