Detecting Intention Through Motor-Imagery-Triggered Pupil Dilations

Human-computer interaction systems that bypass manual control can be beneficial for many use cases, including users with severe motor disability. We investigated pupillometry (inferring mental activity via dilations of the pupil) as an interaction method because it is noninvasive, easy to analyse, and increasingly available for practical development. In 3 experiments we investigated the efficacy of using pupillometry to detect imaginary motor movements of the hand. In Experiment 1 we demonstrated that, on average, the pupillary response is greater when the participant is imagining a hand-grasping motion, as compared with the control condition. In Experiment 2 we investigated how imaginary hand-grasping affects the pupillary response over time. In Experiment 3 we employed a simple classifier to demonstrate single-trial detection of imagined motor events using pupillometry. Using the mean pupil diameter of a single trial, accuracy rates as high as 71.25% were achieved. Implications for the development of a pupillometry-based switch and future directions are discussed.
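The abstract reports single-trial detection from the mean pupil diameter of a trial using "a simple classifier", but this record does not specify which one. The sketch below illustrates one minimal classifier of that kind, a threshold placed midway between the two class means; all function names and diameter values are hypothetical and not taken from the article.

```python
def classify_trial(pupil_samples, threshold):
    """Label a trial 'imagery' if its mean pupil diameter exceeds the threshold."""
    mean_diameter = sum(pupil_samples) / len(pupil_samples)
    return "imagery" if mean_diameter > threshold else "control"


def fit_threshold(imagery_trials, control_trials):
    """Place the decision threshold midway between the two classes' mean diameters."""
    def class_mean(trials):
        return sum(sum(t) / len(t) for t in trials) / len(trials)
    return (class_mean(imagery_trials) + class_mean(control_trials)) / 2


# Hypothetical pupil-diameter traces (mm) for labelled training trials.
imagery = [[3.6, 3.7, 3.8], [3.5, 3.6, 3.7]]
control = [[3.2, 3.3, 3.4], [3.1, 3.2, 3.3]]
threshold = fit_threshold(imagery, control)

print(classify_trial([3.7, 3.8, 3.6], threshold))  # prints "imagery"
```

A real pipeline would additionally baseline-correct each trial and validate the threshold on held-out trials, which is how a single-trial accuracy figure such as the reported 71.25% would be estimated.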

Detailed Description

Bibliographic Details
Published in: Human-computer interaction, 2019-01, Vol. 34 (1), p. 83-113
Main authors: Rozado, David; Lochner, Martin; Engelke, Ulrich; Dünser, Andreas
Format: Article
Language: English
Online access: Full text
DOI: 10.1080/07370024.2017.1293540
Publisher: Hillsdale: Taylor &amp; Francis
Rights: Copyright © 2017 Taylor &amp; Francis Group, LLC
ORCID: https://orcid.org/0000-0001-9037-4687
ISSN: 0737-0024
EISSN: 1532-7051
Source: Taylor &amp; Francis; Business Source Complete
Subjects: Disability; Experiments; Imagery; Manual control; Motors; Pupillometry; User interface