Feature Extraction for Classification of Hyperspectral and LiDAR Data Using Patch-to-Patch CNN
Published in: IEEE Transactions on Cybernetics, 2020-01, Vol. 50 (1), pp. 100-111
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Multisensor fusion is of great importance in Earth observation applications. For instance, hyperspectral images (HSIs) provide rich spectral information while light detection and ranging (LiDAR) data provide elevation information, and using HSI and LiDAR data together can achieve better classification performance. In this paper, an unsupervised feature extraction framework, named patch-to-patch convolutional neural network (PToP CNN), is proposed for collaborative classification of hyperspectral and LiDAR data. More specifically, a three-tower PToP mapping is first developed to seek an accurate representation from HSI to LiDAR data, aiming at merging multiscale features between the two different sources. Then, by integrating hidden layers of the designed PToP CNN, the extracted features are expected to possess deeply fused characteristics. Accordingly, features from different hidden layers are concatenated into a stacked vector and fed into three fully connected layers. To verify the effectiveness of the proposed classification framework, experiments are conducted on two benchmark remote sensing data sets. The experimental results demonstrate that the proposed method provides superior performance compared with state-of-the-art classifiers such as the two-branch CNN and the context CNN.
ISSN: 2168-2267, 2168-2275
DOI: 10.1109/TCYB.2018.2864670
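
The abstract describes a three-tower PToP mapping from HSI patches to LiDAR patches, with hidden-layer features from the towers concatenated into a stacked vector and classified by three fully connected layers. The following is a minimal sketch of such an architecture in PyTorch, based only on that description: the band count, patch size, class count, layer widths, and kernel sizes are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch of a three-tower PToP-style network, assuming PyTorch.
# All sizes (144 bands, 7x7 patches, 15 classes, layer widths) are
# illustrative assumptions, not the configuration used in the paper.
import torch
import torch.nn as nn


class PToPTower(nn.Module):
    """One tower mapping an HSI patch to a LiDAR patch at a single scale."""

    def __init__(self, hsi_bands, kernel_size):
        super().__init__()
        pad = kernel_size // 2  # keep the spatial size of the patch
        self.encode = nn.Sequential(
            nn.Conv2d(hsi_bands, 64, kernel_size, padding=pad), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size, padding=pad), nn.ReLU(),
        )
        # Reconstruct the single-band LiDAR patch from the hidden features.
        self.decode = nn.Conv2d(32, 1, kernel_size, padding=pad)

    def forward(self, hsi_patch):
        hidden = self.encode(hsi_patch)   # hidden-layer features
        lidar_pred = self.decode(hidden)  # HSI -> LiDAR reconstruction
        return hidden, lidar_pred


class PToPClassifier(nn.Module):
    """Three towers; stacked hidden features go to three FC layers."""

    def __init__(self, hsi_bands=144, patch=7, num_classes=15):
        super().__init__()
        # Different kernel sizes stand in for the multiscale towers.
        self.towers = nn.ModuleList(
            [PToPTower(hsi_bands, k) for k in (3, 5, 7)]
        )
        feat_dim = 3 * 32 * patch * patch
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(feat_dim, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, hsi_patch):
        hiddens, lidar_preds = zip(*(t(hsi_patch) for t in self.towers))
        stacked = torch.cat(hiddens, dim=1)  # concatenate hidden layers
        return self.classifier(stacked), lidar_preds


# Usage: the PToP mapping would be trained without labels by regressing
# lidar_preds against real LiDAR patches; the stacked hidden features
# then feed the fully connected classifier head.
model = PToPClassifier()
hsi = torch.randn(8, 144, 7, 7)   # a batch of 7x7 HSI patches
logits, lidar_preds = model(hsi)
print(logits.shape)               # torch.Size([8, 15])
```

The split between reconstruction outputs and stacked hidden features mirrors the two stages the abstract mentions (unsupervised PToP training, then classification on fused features); the exact training procedure and hyperparameters are not given in this record.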