Real-Time Gaze Tracking with Event-Driven Eye Segmentation
| Main Authors: | , , , , |
|---|---|
| Format: | Article |
| Language: | eng |
| Subjects: | |
| Online Access: | Order full text |
| Abstract: | Gaze tracking is increasingly becoming an essential component in Augmented and Virtual Reality. Modern gaze tracking algorithms are heavyweight; they operate at most 5 Hz on mobile processors, even though near-eye cameras comfortably operate at a real-time rate (> 30 Hz). This paper presents a real-time eye tracking algorithm that, on average, operates at 30 Hz on a mobile processor and achieves 0.1°–0.5° gaze accuracies, all while requiring only 30K parameters, one to two orders of magnitude smaller than state-of-the-art eye tracking algorithms. The crux of our algorithm is an Auto ROI mode, which continuously predicts the Regions of Interest (ROIs) of near-eye images and judiciously processes only the ROIs for gaze estimation. To that end, we introduce a novel, lightweight ROI prediction algorithm by emulating an event camera. We discuss how a software emulation of events enables accurate ROI prediction without requiring special hardware. The code of our paper is available at https://github.com/horizon-research/edgaze. |
|---|---|
| DOI: | 10.48550/arxiv.2201.07367 |
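
The abstract's central idea is predicting an ROI by emulating an event camera in software, then running gaze estimation only on that ROI. The sketch below is not the authors' implementation (see the edgaze repository for that); it is a minimal, hypothetical illustration in which `emulate_events` thresholds the per-pixel difference between consecutive grayscale eye frames and `predict_roi` bounds the changed pixels. The threshold, margin, and sparsity cut-off values are assumptions.

```python
import numpy as np

def emulate_events(prev_frame: np.ndarray, curr_frame: np.ndarray,
                   threshold: int = 15) -> np.ndarray:
    """Return a binary map of pixels whose intensity changed by more than
    `threshold` between consecutive grayscale frames, mimicking the sparse
    output of an event camera (threshold value is an assumption)."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

def predict_roi(event_map: np.ndarray, margin: int = 8):
    """Derive a rectangular ROI (x0, y0, x1, y1) bounding the changed
    pixels, padded by `margin`; fall back to the full frame when too few
    pixels changed (the sparsity cut-off is an assumption)."""
    h, w = event_map.shape
    ys, xs = np.nonzero(event_map)
    if xs.size < 50:
        return 0, 0, w, h
    x0 = max(int(xs.min()) - margin, 0)
    y0 = max(int(ys.min()) - margin, 0)
    x1 = min(int(xs.max()) + margin, w)
    y1 = min(int(ys.max()) + margin, h)
    return x0, y0, x1, y1

# Hypothetical usage: crop the current frame to the predicted ROI and feed
# only that crop to the (much heavier) segmentation / gaze network.
# prev, curr = load_consecutive_eye_frames()   # hypothetical loader
# x0, y0, x1, y1 = predict_roi(emulate_events(prev, curr))
# gaze = gaze_network(curr[y0:y1, x0:x1])      # hypothetical model
```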