A Mutual Guide Framework for Training Hyperspectral Image Classifiers With Small Data



Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-17
Authors: Tai, Xiaoxiao; Li, Mingjie; Xiang, Ming; Ren, Peng
Format: Article
Language: English
Abstract: This article develops a general yet effective hyperspectral image (HSI) classification framework that is trained with small data. To this end, two identically structured but differently initialized classifiers, referred to as the two base classifiers, are trained in an iterative manner. Each iteration consists of three steps: 1) the two base classifiers are trained separately on guide data; 2) unclassified data are processed by the two trained base classifiers; and 3) the classification results with high confidence are used as new guide data. In the first iteration, the guide data, comprising the original small training data, are the same for the two base classifiers. From the second iteration onward, the guide data for the two base classifiers start to differ. Specifically, in each iteration, the guide data for training one base classifier are continually augmented by high-confidence classification results provided by the other base classifier. In this iterative manner, the two classifiers continuously provide different new guide data for each other and thus progressively expand the labeled data from the original small training set to a considerably larger number of samples in an HSI. We refer to this training strategy as mutual guide. We develop a mutual guide implementation scheme that exploits extreme learning machines (ELMs) as base classifiers. Extensive experiments on four public HSI datasets, i.e., Indian Pines (IP), Kennedy Space Center (KSC), University of Pavia (UP), and Salinas (SA), validate the classification effectiveness of our mutual guide framework with small training data.
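
As a rough illustration of the mutual guide strategy described in the abstract, the sketch below implements the three-step loop with two differently initialized classifiers that exchange high-confidence pseudo-labels. It is a minimal sketch only: scikit-learn's MLPClassifier stands in for the ELM base classifiers used in the paper, and the function name mutual_guide, the confidence threshold, and the iteration count are illustrative assumptions rather than the authors' implementation.

```python
# Hedged sketch of the mutual guide loop; MLPClassifier stands in for the
# ELM base classifiers, and all hyperparameters here are illustrative.
import numpy as np
from sklearn.neural_network import MLPClassifier

def mutual_guide(X_train, y_train, X_unlabeled, n_iter=5, conf_thresh=0.95):
    # Two identically structured but differently initialized base classifiers.
    clf_a = MLPClassifier(hidden_layer_sizes=(128,), random_state=0, max_iter=500)
    clf_b = MLPClassifier(hidden_layer_sizes=(128,), random_state=1, max_iter=500)

    # First iteration: both classifiers share the original small training set.
    guide_a = (X_train, y_train)
    guide_b = (X_train, y_train)

    for _ in range(n_iter):
        # Step 1: train each base classifier on its own guide data.
        clf_a.fit(*guide_a)
        clf_b.fit(*guide_b)

        # Step 2: classify the unlabeled pixels with both trained classifiers.
        pred_a, pred_b = clf_a.predict(X_unlabeled), clf_b.predict(X_unlabeled)
        conf_a = clf_a.predict_proba(X_unlabeled).max(axis=1)
        conf_b = clf_b.predict_proba(X_unlabeled).max(axis=1)

        # Step 3: cross over the high-confidence results, so each classifier's
        # guide data are augmented by the *other* classifier's predictions.
        keep_a, keep_b = conf_a >= conf_thresh, conf_b >= conf_thresh
        guide_b = (np.vstack([X_train, X_unlabeled[keep_a]]),
                   np.concatenate([y_train, pred_a[keep_a]]))
        guide_a = (np.vstack([X_train, X_unlabeled[keep_b]]),
                   np.concatenate([y_train, pred_b[keep_b]]))

    return clf_a, clf_b
```

In the paper's scheme the base classifiers are extreme learning machines, whose closed-form training keeps the repeated refitting inexpensive; any classifier that exposes class probabilities could slot into the same loop.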
ISSN: 0196-2892; 1558-0644
DOI: 10.1109/TGRS.2021.3092351