A nearest neighbor-based active learning method and its application to time series classification

Highlights:
• Locally estimated prediction uncertainty and sampling utility metrics are introduced for active learning.
• An effective batch-mode active learning algorithm is introduced based on the proposed metrics.
• Experimental results on two time series data sets demonstrate the effectiveness of the proposed method.


Bibliographic Details
Published in: Pattern Recognition Letters 2021-06, Vol. 146, p. 230-236
Main authors: Gweon, Hyukjun; Yu, Hao
Format: Article
Language: English
Description
Abstract: Although the one-nearest-neighbor approach is widely used in time series classification, its successful performance requires enough labeled data, which is often difficult to obtain due to high labeling costs. This article considers a practical classification scenario in which labeled data are scarce but unlabeled data are plentiful, and a limited budget for the annotation task is provided. For effective classification with limited resources, we propose a nearest neighbor-based sampling strategy for active learning. The proposed approach uses highly local information to measure the uncertainty and utility of an unlabeled instance and is applicable to extremely sparse labeled data. Furthermore, we extend the proposed approach to batch-mode active learning to select a batch of informative samples at each sampling iteration. Experimental results on the WAFER and ECG5000 data sets demonstrate the effectiveness of the proposed algorithm compared with other nearest neighbor-based approaches.
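The general idea of the abstract, ranking unlabeled instances by a locally estimated 1-NN uncertainty and greedily selecting a batch, can be sketched as follows. This is a minimal illustration, not the paper's method: the distance-ratio uncertainty and the function names (`local_uncertainty`, `select_batch`) are assumptions, since the article's exact uncertainty and utility metrics are not reproduced in the record.

```python
import math

def euclidean(a, b):
    # Euclidean distance between two equal-length time series
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def local_uncertainty(x, labeled):
    """A plausible local uncertainty for 1-NN active learning (illustrative
    assumption): the ratio of the distance to the nearest labeled neighbor
    over the distance to the nearest labeled neighbor of a different class.
    Values near 1 mean the closest neighbors disagree on the label."""
    # distance to the nearest labeled instance, per class
    nearest = {}
    for xi, yi in labeled:
        d = euclidean(x, xi)
        if yi not in nearest or d < nearest[yi]:
            nearest[yi] = d
    dists = sorted(nearest.values())
    if len(dists) < 2:
        return 0.0  # only one class seen locally: no disagreement
    d1, d2 = dists[0], dists[1]
    return d1 / d2 if d2 > 0 else 1.0

def select_batch(unlabeled, labeled, k):
    """Greedy batch selection: rank unlabeled series by local uncertainty
    and take the top k for annotation."""
    ranked = sorted(unlabeled,
                    key=lambda x: local_uncertainty(x, labeled),
                    reverse=True)
    return ranked[:k]

# Toy example: two labeled series of different classes; the unlabeled
# series halfway between them is the most uncertain and gets selected.
labeled = [([0.0, 0.0], 0), ([1.0, 1.0], 1)]
unlabeled = [[0.1, 0.0], [0.5, 0.5], [0.9, 1.0]]
batch = select_batch(unlabeled, labeled, k=1)  # → [[0.5, 0.5]]
```

In a full active-learning loop, the selected batch would be sent to an annotator, the newly labeled instances appended to `labeled`, and the selection repeated until the labeling budget is exhausted.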
ISSN:0167-8655
1872-7344
DOI:10.1016/j.patrec.2021.03.016