Revolutionizing Gaze-Based Human-Computer Interaction Using Iris Tracking: A Webcam-Based Low-Cost Approach With Calibration, Regression and Real-Time Re-Calibration
Eye movements are essential in human-computer interaction (HCI) because they offer insights into individuals' cognitive states and visual attention. Techniques for adequately assessing gaze have increased in the last two decades. Notably, video-based tracking methods have gained considerable interest within the research community due to their nonintrusive nature.
Saved in:
Published in: | IEEE access 2024, Vol.12, p.168256-168269 |
---|---|
Main authors: | Chhimpa, Govind Ram; Kumar, Ajay; Garhwal, Sunita; Dhiraj; Khan, Faheem; Moon, Yeon-Kug |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 168269 |
---|---|
container_issue | |
container_start_page | 168256 |
container_title | IEEE access |
container_volume | 12 |
creator | Chhimpa, Govind Ram; Kumar, Ajay; Garhwal, Sunita; Dhiraj; Khan, Faheem; Moon, Yeon-Kug |
description | Eye movements are essential in human-computer interaction (HCI) because they offer insights into individuals' cognitive states and visual attention. Techniques for adequately assessing gaze have increased in the last two decades. Notably, video-based tracking methods have gained considerable interest within the research community due to their nonintrusive nature, enabling precise and convenient gaze estimation without physical contact or invasive measures. This paper introduces a video-based gaze-tracking method that presents an affordable, user-friendly, and dependable human-computer interaction (HCI) system based on iris movement. By utilizing the MediaPipe face mesh model, facial features are extracted from real-time video sequences. A 5-point user-specific calibration and multiple regression techniques are employed to predict the gaze point on the screen accurately. The proposed system effectively handles changes in body position and user posture through real-time re-calibration using z-index tracking. Furthermore, it compensates for minor head movements that may introduce inaccuracies. The proposed system is cost-effective, with a general cost below $25, which may vary based on camera usage. Thirteen participants were involved in the system testing. The system demonstrates a high level of sensitivity to low light conditions, a strong response to changes in distance, and a moderate reaction to glasses, with an average frame processing time of 0.047 seconds. On average, it achieves a visual angle accuracy of 1.12 degrees with head movement and 1.3 degrees without head movement. |
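The description above says gaze is predicted by combining a 5-point user-specific calibration with multiple regression. A minimal sketch of that idea follows, assuming a simple affine model per screen axis fitted by ordinary least squares; the iris coordinates are made-up stand-ins for MediaPipe face-mesh output, and `solve3`, `fit_affine`, and `gaze_point` are illustrative names, not the paper's code.

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):  # back-substitution
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_affine(iris_pts, screen_coords):
    """Least-squares fit of screen = a0 + a1*ix + a2*iy via the normal equations."""
    X = [[1.0, ix, iy] for ix, iy in iris_pts]
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
    Xty = [sum(row[i] * s for row, s in zip(X, screen_coords)) for i in range(3)]
    return solve3(XtX, Xty)

# 5-point calibration: the user fixates four corner targets and the center while
# the normalized iris position is recorded (all numbers below are invented).
iris = [(0.42, 0.48), (0.58, 0.48), (0.42, 0.55), (0.58, 0.55), (0.50, 0.515)]
targets = [(100, 100), (1820, 100), (100, 980), (1820, 980), (960, 540)]

ax = fit_affine(iris, [t[0] for t in targets])  # model for screen x
ay = fit_affine(iris, [t[1] for t in targets])  # model for screen y

def gaze_point(ix, iy):
    """Map a normalized iris position to a predicted on-screen point."""
    return (ax[0] + ax[1] * ix + ax[2] * iy,
            ay[0] + ay[1] * ix + ay[2] * iy)
```

Five calibration points over-determine the three coefficients per axis, which is why a least-squares fit (rather than exact interpolation) is the natural choice here.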
doi_str_mv | 10.1109/ACCESS.2024.3498441 |
format | Article |
fullrecord | ieee_id 10752957; doaj_id oai_doaj_org_article_21ad3ac72299486a9497a31f1eb60713; pqid 3130926425; sourceformat XML; sourcetype Open Website |
publisher | Piscataway: IEEE |
rights | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
fulltext link | https://ieeexplore.ieee.org/document/10752957 |
coden | IAECCG |
orcidid | 0009-0008-3337-9578; 0000-0003-3972-8368; 0000-0001-6220-0225; 0000-0002-8959-3724; 0000-0002-9142-2721; 0000-0002-4452-4725 |
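The abstract reports accuracy as a visual angle (1.12 degrees with head movement). As a quick aside, an on-screen gaze error converts to a visual angle with the standard small-target formula; the 60 cm viewing distance and 1.2 cm error used below are assumed typical values for illustration, not figures from the paper.

```python
import math

def visual_angle_deg(error_cm, distance_cm):
    """Visual angle subtended by an on-screen error of `error_cm`,
    viewed from `distance_cm`: 2 * atan(error / (2 * distance))."""
    return math.degrees(2 * math.atan(error_cm / (2 * distance_cm)))

# A ~1.2 cm gaze error at a 60 cm viewing distance is roughly 1.1-1.2 degrees,
# i.e. on the same order as the accuracy the abstract reports.
angle = visual_angle_deg(1.2, 60)
```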
fulltext | fulltext |
identifier | ISSN: 2169-3536 |
ispartof | IEEE access, 2024, Vol.12, p.168256-168269 |
issn | 2169-3536 2169-3536 |
language | eng |
recordid | cdi_proquest_journals_3130926425 |
source | IEEE Open Access Journals; DOAJ Directory of Open Access Journals; EZB-FREE-00999 freely available EZB journals |
subjects | Calibration; Eye movements; eye-gaze tracking; Head; Head movement; Human motion; Human-computer interaction; Human-computer interface; iris-tracking; low-cost; Real time; real-time re-calibration; regression; Sequences; Tracking |
title | Revolutionizing Gaze-Based Human-Computer Interaction Using Iris Tracking: A Webcam-Based Low-Cost Approach With Calibration, Regression and Real-Time Re-Calibration |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-25T00%3A13%3A02IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_ieee_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Revolutionizing%20Gaze-Based%20Human-Computer%20Interaction%20Using%20Iris%20Tracking:%20A%20Webcam-Based%20Low-Cost%20Approach%20With%20Calibration,%20Regression%20and%20Real-Time%20Re-Calibration&rft.jtitle=IEEE%20access&rft.au=Chhimpa,%20Govind%20Ram&rft.date=2024&rft.volume=12&rft.spage=168256&rft.epage=168269&rft.pages=168256-168269&rft.issn=2169-3536&rft.eissn=2169-3536&rft.coden=IAECCG&rft_id=info:doi/10.1109/ACCESS.2024.3498441&rft_dat=%3Cproquest_ieee_%3E3130926425%3C/proquest_ieee_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3130926425&rft_id=info:pmid/&rft_ieee_id=10752957&rft_doaj_id=oai_doaj_org_article_21ad3ac72299486a9497a31f1eb60713&rfr_iscdi=true |