Revolutionizing Gaze-Based Human-Computer Interaction Using Iris Tracking: A Webcam-Based Low-Cost Approach With Calibration, Regression and Real-Time Re-Calibration


Bibliographic details
Published in: IEEE Access, 2024, Vol. 12, p. 168256-168269
Main authors: Chhimpa, Govind Ram; Kumar, Ajay; Garhwal, Sunita; Dhiraj; Khan, Faheem; Moon, Yeon-Kug
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Eye movements are essential in human-computer interaction (HCI) because they offer insights into individuals' cognitive states and visual attention. Techniques for adequately assessing gaze have multiplied over the last two decades. Notably, video-based tracking methods have gained considerable interest within the research community due to their nonintrusive nature, enabling precise and convenient gaze estimation without physical contact or invasive measures. This paper introduces a video-based gaze-tracking method that provides an affordable, user-friendly, and dependable HCI system based on iris movement. Using the MediaPipe face mesh model, facial features are extracted from real-time video sequences. A 5-point user-specific calibration and multiple regression techniques are employed to accurately predict the gaze point on the screen. The proposed system handles changes in body position and user posture through real-time re-calibration using z-index tracking, and it compensates for minor head movements that would otherwise introduce inaccuracies. The system is cost-effective, with a total cost below $25, which may vary based on the camera used. Thirteen participants were involved in testing. The system shows high sensitivity to low-light conditions, a strong response to changes in distance, and a moderate reaction to glasses, with an average frame processing time of 0.047 seconds. On average, it achieves a visual-angle accuracy of 1.12 degrees with head movement and 1.3 degrees without head movement.
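The calibration-plus-regression step summarized above can be illustrated with a minimal sketch: iris-center coordinates recorded while the user fixates a few known on-screen targets are mapped to screen coordinates by least squares. All numbers below (landmark values, a 1920x1080 screen, the five target positions) are invented for illustration; in the actual system, the MediaPipe face mesh supplies the iris landmarks, and the paper's multiple-regression model and real-time re-calibration are more elaborate than this affine fit.

```python
import numpy as np

# Hypothetical 5-point calibration data: normalized iris-center
# coordinates (as a face-mesh model might report them) recorded while
# the user fixates five known targets on a 1920x1080 screen.
iris_pts = np.array([
    [0.45, 0.40],   # gaze at screen center
    [0.30, 0.40],   # left edge
    [0.60, 0.40],   # right edge
    [0.45, 0.30],   # top edge
    [0.45, 0.50],   # bottom edge
])
screen_pts = np.array([
    [960, 540], [0, 540], [1920, 540], [960, 0], [960, 1080],
], dtype=float)

def fit_gaze_map(iris, screen):
    """Least-squares affine map from iris coordinates to screen pixels."""
    A = np.hstack([iris, np.ones((len(iris), 1))])   # append bias column
    coeffs, *_ = np.linalg.lstsq(A, screen, rcond=None)
    return coeffs                                    # shape (3, 2)

def predict_gaze(coeffs, iris_xy):
    """Predict the on-screen gaze point for one iris observation."""
    return np.array([iris_xy[0], iris_xy[1], 1.0]) @ coeffs

coeffs = fit_gaze_map(iris_pts, screen_pts)
center = predict_gaze(coeffs, [0.45, 0.40])  # should land near (960, 540)
```

The same fit-then-predict structure supports the paper's real-time re-calibration: when posture or distance changes are detected, fresh calibration samples can be collected and the mapping refit on the fly.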
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3498441