Automatic Detection of Mind Wandering from Video in the Lab and in the Classroom


Bibliographic Details
Published in: IEEE Transactions on Affective Computing, 2021-10, Vol. 12 (4), p. 974-988
Authors: Bosch, Nigel; D'Mello, Sidney K.
Format: Article
Language: English
Description
Abstract: We report two studies that used facial features to automatically detect mind wandering, a ubiquitous phenomenon whereby attention drifts from the current task to unrelated thoughts. In a laboratory study, university students (N = 152) read a scientific text, whereas in a classroom study high school students (N = 135) learned biology from an intelligent tutoring system. Mind wandering was measured using validated self-report methods. In the lab, we recorded face videos and analyzed these at six levels of granularity: (1) upper-body movement; (2) head pose; (3) facial textures; (4) facial action units (AUs); (5) co-occurring AUs; and (6) temporal dynamics of AUs. Due to privacy constraints, videos were not recorded in the classroom. Instead, we extracted head pose, AUs, and AU co-occurrences in real time. Machine learning models, consisting of support vector machines (SVMs) and deep neural networks, achieved F1 scores of .478 and .414 (25.4 and 20.9 percent above-chance improvements, both with SVMs) for detecting mind wandering in the lab and classroom, respectively. The lab-based detectors achieved an 8.4 percent improvement over the previous state of the art; no comparison is available for classroom detectors. We discuss how the detectors can integrate into intelligent interfaces to increase engagement and learning by responding to wandering minds.
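The abstract reports detector performance both as F1 scores and as "percent above-chance improvement." A minimal sketch of how such figures can be computed, assuming a kappa-style normalization against a chance-level F1; the chance baseline used below is hypothetical for illustration and is not taken from the article:

```python
def f1(tp, fp, fn):
    """F1 score from true positive, false positive, and false negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def above_chance_improvement(observed_f1, chance_f1):
    """Fraction of the possible improvement over chance that was achieved
    (a kappa-style normalization; an assumption here, not the paper's stated formula)."""
    return (observed_f1 - chance_f1) / (1.0 - chance_f1)

# Illustrative only: chance_f1 = 0.305 is a hypothetical baseline.
print(round(above_chance_improvement(0.478, 0.305), 3))  # → 0.249
```

With these illustrative inputs the normalization yields roughly a 25 percent above-chance improvement, the same order as the figures reported in the abstract.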
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2019.2908837