Detecting Intention Through Motor-Imagery-Triggered Pupil Dilations
| Field | Value |
| --- | --- |
| Published in: | Human-Computer Interaction, 2019-01, Vol. 34 (1), p. 83-113 |
| Main authors: | , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Full text |
| Abstract: | Human-computer interaction systems that bypass manual control can be beneficial for many use cases, including users with severe motor disability. We investigated pupillometry (inferring mental activity via dilations of the pupil) as an interaction method because it is noninvasive, easy to analyse, and increasingly available for practical development. In 3 experiments we investigated the efficacy of using pupillometry to detect imaginary motor movements of the hand. In Experiment 1 we demonstrated that, on average, the pupillary response is greater when the participant is imagining a hand-grasping motion, as compared with the control condition. In Experiment 2 we investigated how imaginary hand-grasping affects the pupillary response over time. In Experiment 3 we employed a simple classifier to demonstrate single-trial detection of imagined motor events using pupillometry. Using the mean pupil diameter of a single trial, accuracy rates as high as 71.25% were achieved. Implications for the development of a pupillometry-based switch and future directions are discussed. |
| ISSN: | 0737-0024, 1532-7051 |
| DOI: | 10.1080/07370024.2017.1293540 |
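
The abstract reports that single-trial detection of imagined hand movements was possible from the mean pupil diameter of a trial using a simple classifier, with accuracy as high as 71.25%. The record does not specify the classifier or preprocessing used, so the sketch below is only a rough illustration of that general idea: synthetic per-trial mean diameters and an assumed threshold rule stand in for the real data and method; it is not the authors' implementation.

```python
# Illustrative sketch only: classify a single trial as "motor imagery" vs. "control"
# from its mean pupil diameter. Data are synthetic and the threshold rule is an
# assumption for demonstration, not the paper's actual pipeline.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-trial mean pupil diameters (arbitrary units): imagery trials are
# assumed to show slightly larger dilations, per the abstract's Experiment 1 result.
imagery = rng.normal(loc=4.2, scale=0.3, size=80)   # label 1
control = rng.normal(loc=4.0, scale=0.3, size=80)   # label 0

X = np.concatenate([imagery, control])
y = np.concatenate([np.ones(80), np.zeros(80)])

# Random train/test split of the 160 trials.
idx = rng.permutation(len(X))
train, test = idx[:120], idx[120:]

# "Simple classifier": pick the threshold that maximizes training accuracy.
candidates = np.sort(X[train])
accs = [np.mean((X[train] > t) == y[train]) for t in candidates]
threshold = candidates[int(np.argmax(accs))]

# Evaluate single-trial accuracy on held-out trials.
test_acc = np.mean((X[test] > threshold) == y[test])
print(f"threshold = {threshold:.3f}, held-out single-trial accuracy = {test_acc:.1%}")
```

With real pupil recordings, the same one-feature setup could feed any off-the-shelf classifier (e.g., logistic regression) in place of the threshold rule without changing the overall structure.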