Uncertainty-Based Active Learning via Sparse Modeling for Image Classification

Bibliographic Details
Published in: IEEE Transactions on Image Processing, 2019-01, Vol. 28 (1), p. 316-329
Main Authors: Wang, Gaoang; Hwang, Jenq-Neng; Rose, Craig; Wallace, Farron
Format: Article
Language: English
Description
Abstract: Uncertainty sampling-based active learning has been well studied for selecting informative samples to improve the performance of a classifier. In batch-mode active learning, a batch of samples is selected per query, and the samples with the highest uncertainty are preferred. However, this selection strategy ignores the relations among the samples: the selected samples may contain highly redundant information. This paper addresses this problem by proposing a novel method that combines uncertainty, diversity, and density via sparse modeling in the sample selection. We use a sparse linear combination with Gaussian kernels to represent the uncertainty of the unlabeled pool data, in which diversity and density are well incorporated. A selective sampling method is applied before optimization to reduce the representation error. To deal with the l0-norm constraint in the sparse problem, two approximate approaches are adopted for efficient optimization. Four image classification data sets are used for evaluation. Extensive experiments related to batch size, feature space, seed size, significance analysis, data transforms, and time efficiency demonstrate the advantages of the proposed method.
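The core idea in the abstract can be sketched in code. The fragment below is a minimal illustration, not the authors' implementation: it replaces the paper's two l0-norm approximation schemes with a simple greedy (orthogonal-matching-pursuit-style) solver, and the function names, `gamma` bandwidth, and batch size are illustrative assumptions. It selects the batch as the set of Gaussian-kernel columns whose sparse linear combination best reconstructs the pool's uncertainty vector, so that high-uncertainty, dense, and mutually diverse samples are favored.

```python
import numpy as np

def gaussian_kernel(X, gamma=1.0):
    # Pairwise squared distances -> RBF (Gaussian) kernel matrix.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def sparse_batch_select(X, uncertainty, batch_size, gamma=1.0):
    """Greedy approximation of an l0-constrained sparse representation:
    pick `batch_size` kernel columns whose linear combination best
    reconstructs the uncertainty vector of the unlabeled pool.
    (Illustrative sketch; the paper uses different l0 approximations.)"""
    K = gaussian_kernel(X, gamma)
    u = np.asarray(uncertainty, dtype=float)
    residual = u.copy()
    selected = []
    for _ in range(batch_size):
        # Correlation of each candidate's kernel column with the residual;
        # already-selected columns are masked out to enforce diversity.
        scores = K.T @ residual
        scores[selected] = -np.inf
        j = int(np.argmax(scores))
        selected.append(j)
        # Least-squares refit on the chosen columns, then update residual.
        Ks = K[:, selected]
        w, *_ = np.linalg.lstsq(Ks, u, rcond=None)
        residual = u - Ks @ w
    return selected
```

Because each new column is scored against the current residual, a sample close to an already-selected one adds little and is skipped, which is how diversity enters without an explicit pairwise penalty.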
ISSN: 1057-7149, 1941-0042
DOI:10.1109/TIP.2018.2867913