Automated gaze-based mind wandering detection during computerized learning in classrooms
We investigate the use of commercial off-the-shelf (COTS) eye-trackers to automatically detect mind wandering—a phenomenon involving a shift in attention from task-related to task-unrelated thoughts—during computerized learning. Study 1 (N = 135 high-school students) tested the feasibility of COTS eye tracking while students learn biology with an intelligent tutoring system called GuruTutor in their classroom. We could successfully track eye gaze in 75% (both eyes tracked) and 95% (one eye tracked) of the cases for 85% of the sessions where gaze was successfully recorded. In Study 2, we used this data to build automated student-independent detectors of mind wandering, obtaining accuracies (mind wandering F1 = 0.59) substantially better than chance (F1 = 0.24). Study 3 investigated context-generalizability of mind wandering detectors, finding that models trained on data collected in a controlled laboratory more successfully generalized to the classroom than the reverse. Study 4 investigated gaze- and video-based mind wandering detection, finding that gaze-based detection was superior and multimodal detection yielded an improvement in limited circumstances. We tested live mind wandering detection on a new sample of 39 students in Study 5 and found that detection accuracy (mind wandering F1 = 0.40) was considerably above chance (F1 = 0.24), albeit lower than offline detection accuracy from Study 1 (F1 = 0.59), a finding attributable to handling of missing data. We discuss our next steps towards developing gaze-based attention-aware learning technologies to increase engagement and learning by combating mind wandering in classroom contexts.
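The record does not describe the authors' modeling pipeline, so the sketch below is only a hedged illustration of the two ideas the abstract relies on: student-independent evaluation, in which no learner contributes data to both the training and test folds, and a chance baseline whose F1 (the harmonic mean of precision and recall) comes from guessing the positive class at its base rate. The gaze-feature layout, the RandomForestClassifier, and the synthetic data are assumptions for illustration, not the method of Hutt et al.

```python
# Hedged illustration only: the gaze features, the classifier, and the data below
# are assumptions for demonstration, not the pipeline used by Hutt et al.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(0)

# Synthetic stand-in for per-window gaze features (e.g., fixation durations,
# saccade amplitudes): one row per gaze window, plus the student who produced it
# and a self-report label (1 = mind wandering, at roughly a 25% base rate).
n_windows, n_features, n_students = 2000, 12, 50
X = rng.normal(size=(n_windows, n_features))
students = rng.integers(0, n_students, size=n_windows)
y = (rng.random(n_windows) < 0.25).astype(int)

model_f1, chance_f1 = [], []
# GroupKFold keeps every student's windows in a single fold, so no student
# contributes data to both training and testing ("student-independent").
for train_idx, test_idx in GroupKFold(n_splits=5).split(X, y, groups=students):
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    model_f1.append(f1_score(y[test_idx], clf.predict(X[test_idx]), pos_label=1))

    # Chance baseline: ignore the features and guess "mind wandering" at the
    # base rate observed in the training folds.
    base_rate = y[train_idx].mean()
    guess = (rng.random(len(test_idx)) < base_rate).astype(int)
    chance_f1.append(f1_score(y[test_idx], guess, pos_label=1))

print(f"mean model F1:  {np.mean(model_f1):.2f}")
print(f"mean chance F1: {np.mean(chance_f1):.2f}")
```

Because the synthetic features carry no signal, the detector here has nothing to learn and should not beat the chance baseline; the abstract's offline result (mind wandering F1 = 0.59 against a chance F1 of 0.24) is what the same protocol reports when the gaze features are genuinely predictive.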
Published in: | User modeling and user-adapted interaction, 2019-09, Vol.29 (4), p.821-867 |
---|---|
Main authors: | Hutt, Stephen; Krasich, Kristina; Mills, Caitlin; Bosch, Nigel; White, Shelby; Brockmole, James R.; D’Mello, Sidney K. |
Format: | Article |
Language: | eng |
Subjects: | Accuracy; Automation; Classrooms; Commercial off-the-shelf technology; Computer Science; Detectors; Eye movements; Feasibility studies; Learning; Management of Computing and Information Systems; Missing data; Multimedia Information Systems; Students; Tracking; User Interfaces and Human Computer Interaction |
Online access: | Full text |
container_end_page | 867 |
---|---|
container_issue | 4 |
container_start_page | 821 |
container_title | User modeling and user-adapted interaction |
container_volume | 29 |
creator | Hutt, Stephen; Krasich, Kristina; Mills, Caitlin; Bosch, Nigel; White, Shelby; Brockmole, James R.; D’Mello, Sidney K. |
description | We investigate the use of commercial off-the-shelf (COTS) eye-trackers to automatically detect mind wandering—a phenomenon involving a shift in attention from task-related to task-unrelated thoughts—during computerized learning. Study 1 (N = 135 high-school students) tested the feasibility of COTS eye tracking while students learn biology with an intelligent tutoring system called GuruTutor in their classroom. We could successfully track eye gaze in 75% (both eyes tracked) and 95% (one eye tracked) of the cases for 85% of the sessions where gaze was successfully recorded. In Study 2, we used this data to build automated student-independent detectors of mind wandering, obtaining accuracies (mind wandering F1 = 0.59) substantially better than chance (F1 = 0.24). Study 3 investigated context-generalizability of mind wandering detectors, finding that models trained on data collected in a controlled laboratory more successfully generalized to the classroom than the reverse. Study 4 investigated gaze- and video-based mind wandering detection, finding that gaze-based detection was superior and multimodal detection yielded an improvement in limited circumstances. We tested live mind wandering detection on a new sample of 39 students in Study 5 and found that detection accuracy (mind wandering F1 = 0.40) was considerably above chance (F1 = 0.24), albeit lower than offline detection accuracy from Study 1 (F1 = 0.59), a finding attributable to handling of missing data. We discuss our next steps towards developing gaze-based attention-aware learning technologies to increase engagement and learning by combating mind wandering in classroom contexts. |
doi_str_mv | 10.1007/s11257-019-09228-5 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0924-1868 |
ispartof | User modeling and user-adapted interaction, 2019-09, Vol.29 (4), p.821-867 |
issn | 0924-1868; 1573-1391 |
language | eng |
recordid | cdi_proquest_journals_2232985074 |
source | SpringerNature Journals; EBSCOhost Business Source Complete |
subjects | Accuracy; Automation; Classrooms; Commercial off-the-shelf technology; Computer Science; Detectors; Eye movements; Feasibility studies; Learning; Management of Computing and Information Systems; Missing data; Multimedia Information Systems; Students; Tracking; User Interfaces and Human Computer Interaction |
title | Automated gaze-based mind wandering detection during computerized learning in classrooms |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-20T11%3A13%3A27IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Automated%20gaze-based%20mind%20wandering%20detection%20during%20computerized%20learning%20in%20classrooms&rft.jtitle=User%20modeling%20and%20user-adapted%20interaction&rft.au=Hutt,%20Stephen&rft.date=2019-09-01&rft.volume=29&rft.issue=4&rft.spage=821&rft.epage=867&rft.pages=821-867&rft.issn=0924-1868&rft.eissn=1573-1391&rft_id=info:doi/10.1007/s11257-019-09228-5&rft_dat=%3Cproquest_cross%3E2232985074%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2232985074&rft_id=info:pmid/&rfr_iscdi=true |