Tackling Micro-Expression Data Shortage via Dataset Alignment and Active Learning


Bibliographic Details
Published in: IEEE Transactions on Multimedia, 2023, Vol. 25, pp. 5429-5443
Main authors: Ben, Xianye; Gong, Chen; Huang, Tianhuan; Li, Chuanye; Yan, Rui; Li, Yujun
Format: Article
Language: English
Description
Abstract: Research on micro-expression recognition has drawn great attention in recent years because of its potential in lie detection, clinical diagnosis, and national security. Among the many challenges, data shortage stands out, as it directly prevents accurate training of micro-expression recognition algorithms. In this work, we present a dataset alignment and active learning (DAAL) framework. DAAL queries a minimum number of examples to label and transfers features from the micro-expression dataset to the macro-expression dataset. Specifically, features from the micro-expression dataset are mapped to the macro-expression dataset with a translator, so that the classifier trained on the macro-expression dataset can be adjusted and adapted to boost classification performance on the micro-expression dataset. In addition, the most informative examples in the micro-expression dataset are selected iteratively through active learning, which effectively improves the classification ability of the model. Comprehensive experiments on the CASME, CASME II, SAMM, and SMIC databases demonstrate that the proposed DAAL outperforms previous works by a large margin on the micro-expression recognition task.
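The align-then-query loop the abstract describes can be sketched as follows. Everything here is an illustrative assumption rather than the paper's actual implementation: the translator is a linear least-squares map, the classifier is a plain logistic regression, the query criterion is uncertainty sampling, and the "macro" and "micro" data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: "micro" features relate to "macro" features by an
# (unknown) linear map A_true; labels come from one shared linear rule.
n_macro, n_micro, d = 200, 60, 8
A_true = rng.normal(size=(d, d))
X_macro = rng.normal(size=(n_macro, d))
X_micro = rng.normal(size=(n_micro, d))
w_true = rng.normal(size=d)
y_macro = (X_macro @ w_true > 0).astype(int)
y_micro = ((X_micro @ A_true.T) @ w_true > 0).astype(int)

def train_logreg(X, y, steps=500, lr=0.1):
    """Logistic regression via batch gradient descent (no regularization)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# 1) Classifier trained on the (large) macro-expression domain.
w = train_logreg(X_macro, y_macro)

# 2) Translator: least-squares map sending micro features into the macro
#    feature space, fit on a small set of assumed aligned pairs.
paired = X_micro[:10]
A_hat, *_ = np.linalg.lstsq(paired, paired @ A_true.T, rcond=None)
translate = lambda X: X @ A_hat

# 3) Active learning: repeatedly query the micro example the current model
#    is least certain about, add its label, and retrain on translated features.
labeled = list(range(10))
for _ in range(5):
    pool = [i for i in range(n_micro) if i not in labeled]
    p = 1.0 / (1.0 + np.exp(-translate(X_micro[pool]) @ w))
    query = pool[int(np.argmin(np.abs(p - 0.5)))]  # closest to decision boundary
    labeled.append(query)
    w = train_logreg(translate(X_micro[labeled]), y_micro[labeled], steps=200)

acc = np.mean((1.0 / (1.0 + np.exp(-translate(X_micro) @ w)) > 0.5) == y_micro)
print(f"micro-domain accuracy after {len(labeled)} labels: {acc:.2f}")
```

The design point is that each queried label is spent where the model is most uncertain, so a handful of micro-expression annotations can adapt a classifier that was mostly trained on abundant macro-expression data.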
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2022.3192727