Matrix Factorization With Framelet and Saliency Priors for Hyperspectral Anomaly Detection


Bibliographic details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023, Vol. 61, pp. 1-13
Main authors: Shen, Xiangfei; Liu, Haijun; Nie, Jing; Zhou, Xichuan
Format: Article
Language: English
Description
Abstract: Hyperspectral anomaly detection aims to separate sparse anomalies from low-rank background components. A variety of detectors have been proposed to identify anomalies, but most of them tend to emphasize characterizing backgrounds with multiple types of prior knowledge and limited information on anomaly components. To tackle these issues, this article simultaneously focuses on two components and proposes a matrix factorization method with framelet and saliency priors to handle the anomaly detection problem. We first employ a framelet to characterize nonnegative background representation coefficients, as they can jointly maintain sparsity and piecewise smoothness after framelet decomposition. We then exploit saliency prior knowledge to measure each pixel's potential to be an anomaly. Finally, we incorporate the pure pixel index (PPI) with Reed-Xiaoli's (RX) method to possess representative dictionary atoms. We solve the optimization problem using a block successive upper-bound minimization (BSUM) framework with guaranteed convergence. Experiments conducted on benchmark hyperspectral datasets demonstrate that the proposed method outperforms some state-of-the-art anomaly detection methods.
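The abstract's core model is a dictionary-based decomposition of the pixel matrix into a low-rank background (dictionary atoms times nonnegative coefficients) plus a sparse anomaly residual. The sketch below is a minimal, generic illustration of that decomposition idea only — it uses plain alternating least squares with soft-thresholding, not the paper's framelet/saliency priors or the BSUM solver, and all function names and parameters (`background_anomaly_split`, `lam`, the toy dictionary `D`) are hypothetical:

```python
import numpy as np

def soft_threshold(M, tau):
    """Elementwise soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def background_anomaly_split(X, D, lam=0.5, n_iter=50):
    """Decompose X (bands x pixels) as X ~ D @ A + S.

    D (bands x atoms): background dictionary; A: nonnegative
    representation coefficients; S: sparse residual flagging anomalies.
    Alternates a least-squares update for A (clipped to nonnegative)
    with soft-thresholding for S. A simplified stand-in for the
    prior-regularized factorization described in the abstract.
    """
    S = np.zeros_like(X)
    D_pinv = np.linalg.pinv(D)  # pseudo-inverse for the least-squares step
    for _ in range(n_iter):
        A = np.maximum(D_pinv @ (X - S), 0.0)  # nonnegative coefficients
        S = soft_threshold(X - D @ A, lam)     # sparse anomaly part
    return A, S

# Toy example: background spanned by two spectra, one injected anomaly.
rng = np.random.default_rng(0)
D = np.abs(rng.normal(size=(20, 2)))        # hypothetical dictionary atoms
A_true = np.abs(rng.normal(size=(2, 100)))  # true background coefficients
X = D @ A_true
X[:, 7] += 5.0                              # inject an anomaly at pixel 7
A, S = background_anomaly_split(X, D)
scores = np.linalg.norm(S, axis=0)          # per-pixel anomaly score
print(int(np.argmax(scores)))               # index of the flagged pixel
```

In the paper itself, the dictionary would come from PPI combined with the RX detector, and the coefficient and anomaly updates would carry framelet and saliency regularizers solved within a BSUM framework; the sketch only shows the background-plus-sparse-residual structure those pieces plug into.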
ISSN: 0196-2892; 1558-0644
DOI: 10.1109/TGRS.2023.3248599