Supervised Deep Feature Extraction for Hyperspectral Image Classification
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2018-04, Vol. 56 (4), p. 1909-1921
Main authors: , , , , ,
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: Hyperspectral image classification has become a research focus in the recent literature; however, the design of effective features, which strongly influences classifier performance, remains an open issue. In this paper, a novel supervised deep feature extraction method based on a siamese convolutional neural network (S-CNN) is proposed to improve the performance of hyperspectral image classification. First, a CNN with five layers is designed to extract deep features directly from the hyperspectral cube; this CNN can be regarded as a nonlinear transformation function. Then, a siamese network composed of two such CNNs is trained to learn features with low intraclass and high interclass variability. An important characteristic of the presented approach is that the S-CNN is supervised with a margin ranking loss function, which yields more discriminative features for classification tasks. To demonstrate the effectiveness of the proposed feature extraction method, the features extracted from three widely used hyperspectral data sets are fed into a linear support vector machine (SVM) classifier. The experimental results show that the proposed feature extraction method, in conjunction with a linear SVM classifier, obtains better classification performance than conventional methods.
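To make the described pipeline concrete, the following is a minimal PyTorch sketch of the general S-CNN idea, not the authors' implementation: the names (`EmbeddingCNN`, `margin_ranking_loss`), the layer sizes, the 200-band 9x9-patch input shape, and the triplet-style formulation of the margin ranking loss are all illustrative assumptions; the paper's exact architecture and pairing scheme may differ.

```python
# Sketch of the S-CNN idea (assumptions, not the authors' code): one small
# CNN embeds hyperspectral patches, and a margin-based ranking loss pulls
# same-class pairs together while pushing different-class pairs apart.
import torch
import torch.nn as nn

class EmbeddingCNN(nn.Module):
    """Hypothetical five-layer CNN mapping a hyperspectral patch
    (spectral bands treated as input channels) to a feature vector."""
    def __init__(self, n_bands: int, feat_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_bands, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.Conv2d(128, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, feat_dim),
        )

    def forward(self, x):  # x: (batch, n_bands, height, width)
        return self.net(x)

def margin_ranking_loss(anchor, positive, negative, margin=1.0):
    """Same-class distance should be smaller than different-class
    distance by at least `margin` (one common ranking formulation)."""
    d_pos = torch.norm(anchor - positive, dim=1)
    d_neg = torch.norm(anchor - negative, dim=1)
    return torch.clamp(margin + d_pos - d_neg, min=0).mean()

# The siamese branches share weights: the same network embeds every
# input, and the loss compares the resulting feature vectors.
net = EmbeddingCNN(n_bands=200)                      # band count is illustrative
a, p, n = (torch.randn(8, 200, 9, 9) for _ in range(3))
loss = margin_ranking_loss(net(a), net(p), net(n))
loss.backward()
```

After training, the embedding network alone would be used to extract features, which could then be classified with a linear SVM (e.g., `sklearn.svm.LinearSVC`), mirroring the evaluation described in the abstract.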
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2017.2769673