Photoplethysmogram-based Cognitive Load Assessment Using Multi-Feature Fusion Model

Bibliographic Details
Published in: ACM Transactions on Applied Perception, 2019-09, Vol. 16 (4), p. 1-17
Authors: Zhang, Xiao; Lyu, Yongqiang; Qu, Tong; Qiu, Pengfei; Luo, Xiaomin; Zhang, Jingyu; Fan, Shunjie; Shi, Yuanchun
Format: Article
Language: English
Online Access: Full text
Description
Summary: Cognitive load assessment is crucial for user studies and human-computer interaction design. As a noninvasive and easy-to-use category of measures, current photoplethysmogram (PPG)-based assessment methods rely on single or small sets of predefined features to recognize responses induced by people's cognitive load, which makes their assessment accuracy unstable. In this study, we propose a machine-learning method that uses 46 kinds of PPG features in combination to improve the measurement accuracy for cognitive load. We test the method on 16 participants through the classical n-back tasks (0-back, 1-back, and 2-back). The accuracy of the machine-learning method in differentiating levels of cognitive load induced by task difficulty reaches 100% for 0-back vs. 2-back tasks, outperforming the traditional HRV-based and single-PPG-feature-based methods by 12-55%. With "leave-one-participant-out" subject-independent cross-validation, the method reaches 87.5% binary classification accuracy, which is at the state-of-the-art level. The proposed method also supports real-time cognitive load assessment through beat-to-beat classification, with better performance than the traditional single-feature-based real-time evaluation method.
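
As an illustration of the evaluation protocol described in the abstract, the sketch below shows a subject-independent "leave-one-participant-out" split over a fused PPG feature matrix, using scikit-learn's LeaveOneGroupOut with participant IDs as the grouping variable. The classifier choice (an RBF SVM), the segment counts, and the synthetic feature matrix are illustrative assumptions only; the paper's actual feature extraction and model are not reproduced here.

    # Minimal sketch of "leave-one-participant-out" cross-validation for a
    # multi-feature cognitive-load classifier. Assumes a precomputed feature
    # matrix (one row per PPG segment, 46 feature columns) and binary task
    # labels (e.g., 0-back vs. 2-back). The SVM classifier and the random
    # data are assumptions for demonstration, not the paper's pipeline.
    import numpy as np
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_participants, segments_per_task = 16, 30  # 16 participants, as in the study
    n_features = 46  # number of fused PPG features reported in the abstract

    # Synthetic stand-in data: X holds fused PPG features, y the task label,
    # and groups the participant ID used to hold out one subject per fold.
    X = rng.normal(size=(n_participants * 2 * segments_per_task, n_features))
    y = np.tile(np.repeat([0, 1], segments_per_task), n_participants)
    groups = np.repeat(np.arange(n_participants), 2 * segments_per_task)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
    print(f"Subject-independent accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")

Holding out every segment from one participant per fold, as here, prevents within-subject leakage and matches the subject-independent setting under which the abstract reports 87.5% binary accuracy.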
ISSN: 1544-3558, 1544-3965
DOI: 10.1145/3340962