SNMF-Net: Learning a Deep Alternating Neural Network for Hyperspectral Unmixing

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-16
Main Authors: Xiong, Fengchao; Zhou, Jun; Tao, Shuyin; Lu, Jianfeng; Qian, Yuntao
Format: Article
Language: English
Description
Summary: Hyperspectral unmixing is an important tool for identifying the constituent materials in a scene and their corresponding spatial distributions. Because the problem is highly ill-posed, a physical spectral mixture model is essential to solving it. In this article, we introduce a linear spectral mixture model (LMM)-based end-to-end deep neural network, named SNMF-Net, for hyperspectral unmixing. SNMF-Net adopts an alternating architecture and benefits from both model-based and learning-based methods. On the one hand, SNMF-Net has high physical interpretability, as it is built by unrolling the L_p sparsity-constrained nonnegative matrix factorization (L_p-NMF) model, which belongs to the LMM family. On the other hand, all parameters and submodules of SNMF-Net can be seamlessly linked with the alternating optimization algorithm of L_p-NMF and the unmixing problem. This enables us to integrate prior knowledge about unmixing, the optimization algorithm, and sparse representation theory into the network for robust learning, thereby improving unmixing performance. Experimental results on synthetic and real-world data show the advantages of the proposed SNMF-Net over many state-of-the-art methods.
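The L_p-NMF model that SNMF-Net unrolls factorizes the observed spectra Y into nonnegative endmember signatures E and sparse abundance maps A by alternating between the two factors. The following is a minimal Python sketch using standard multiplicative update rules; the function name `lp_nmf`, the penalty weight `lam`, and the specific update rules are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def lp_nmf(Y, r, p=0.5, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """Sketch of L_p sparsity-constrained NMF for unmixing: Y ~= E @ A,
    with E, A >= 0 and an L_p (0 < p < 1) penalty on the abundances A.
    Illustrative only; not the SNMF-Net authors' code.

    Y : (bands, pixels) observed spectra
    r : number of endmembers
    """
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    E = rng.random((m, r)) + eps   # endmember signatures
    A = rng.random((r, n)) + eps   # abundance maps
    for _ in range(n_iter):
        # Abundance step: data-fit term plus the gradient of lam * ||A||_p^p,
        # which contributes lam * p * A**(p-1) to the denominator.
        A *= (E.T @ Y) / (E.T @ E @ A + lam * p * A ** (p - 1) + eps)
        A = np.maximum(A, eps)     # keep strictly positive for A**(p-1)
        # Endmember step: plain multiplicative NMF update.
        E *= (Y @ A.T) / (E @ A @ A.T + eps)
    return E, A
```

SNMF-Net replaces a fixed number of such alternating iterations with learnable network layers, so quantities like the penalty weight become trainable parameters rather than hand-tuned constants.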
ISSN: 0196-2892
EISSN: 1558-0644
DOI: 10.1109/TGRS.2021.3081177