A Triplet Semisupervised Deep Network for Fusion Classification of Hyperspectral and LiDAR Data



Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-13
Main Authors: Li, Jiaojiao; Ma, Yinle; Song, Rui; Xi, Bobo; Hong, Danfeng; Du, Qian
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Data fusion of hyperspectral and light detection and ranging (LiDAR) data is conducive to obtaining more comprehensive surface information and thereby achieving better classification results in Earth monitoring systems. However, a lack of labeled samples usually limits the performance of supervised classifiers, and the heterogeneity of multisource data also poses great challenges to data fusion. To address these issues, we propose a triplet semisupervised deep network (TSDN) for fusion classification of hyperspectral and LiDAR data. Specifically, we utilize three basic pathways to extract deep-learning features: a 1-D convolutional neural network (CNN) for spectral features in hyperspectral data, a 2-D CNN for spatial features in hyperspectral data, and a Cascade Net for elevation features in LiDAR data. Furthermore, a novel label calibration module (LCM) is proposed to generate effective pseudo-labels with high confidence, based on superpixel segmentation, by comparing multiview classification results to assist semisupervised model training. In addition, we design a novel 3D-Cross Attention Block to enhance the complementary spatial features of the multisource data. Experiments on three public HSI-LiDAR benchmarks, Houston, Trento, and MUUFL Gulfport, demonstrate the effectiveness and superiority of the proposed method.
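The abstract's triplet design (a 1-D spectral pathway, a 2-D spatial pathway, and an elevation pathway over LiDAR, fused for classification) can be sketched in plain NumPy. This is a minimal illustration of the idea only: the kernel sizes, patch sizes, band count, and simple concatenation fusion are illustrative assumptions, not the authors' actual TSDN architecture, which also includes the LCM and the 3D-Cross Attention Block.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_valid(x, k):
    """1-D valid convolution of a spectral vector x with kernel k."""
    n = len(x) - len(k) + 1
    return np.array([np.dot(x[i:i + len(k)], k) for i in range(n)])

def conv2d_valid(img, k):
    """2-D valid convolution of a single-band patch img with kernel k."""
    kh, kw = k.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

# Illustrative inputs for one pixel: a hyperspectral signature, its
# spatial neighborhood (one band shown), and a LiDAR elevation patch.
# All sizes are assumptions for the sketch, not the paper's settings.
spectrum    = rng.random(144)        # hyperspectral bands
hsi_patch   = rng.random((11, 11))   # HSI spatial neighborhood
lidar_patch = rng.random((11, 11))   # LiDAR elevation neighborhood

# Pathway 1: 1-D CNN over the spectral signature (ReLU activation).
spec_feat = np.maximum(conv1d_valid(spectrum, rng.standard_normal(7)), 0)

# Pathway 2: 2-D CNN over the HSI spatial patch.
spat_feat = np.maximum(
    conv2d_valid(hsi_patch, rng.standard_normal((3, 3))), 0).ravel()

# Pathway 3: a cascade of two small convolutions over the LiDAR patch,
# standing in for the paper's Cascade Net.
elev = np.maximum(conv2d_valid(lidar_patch, rng.standard_normal((3, 3))), 0)
elev_feat = np.maximum(
    conv2d_valid(elev, rng.standard_normal((3, 3))), 0).ravel()

# Fuse the three views into one descriptor for a classifier head.
fused = np.concatenate([spec_feat, spat_feat, elev_feat])
print(fused.shape)
```

In a real implementation each pathway would be a trained deep network and the fusion step would involve the attention mechanism described above; the point here is only the three parallel feature extractors feeding one fused representation.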
ISSN: 0196-2892, 1558-0644
DOI:10.1109/TGRS.2022.3213513