Gaze-based interaction: A 30 year retrospective
Published in: Computers & Graphics, 2018-06, Vol. 73, p. 59-69
Author:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract:
• A survey of gaze-based interaction is given, focusing on four parts.
• The four parts are diagnostic, active, passive, and expressive applications.
• Seminal results and recent advancements are reviewed.

Gaze-based interaction is reviewed, categorized within a taxonomy that splits interaction into four forms, namely diagnostic (off-line measurement), active (selection, look to shoot), passive (foveated rendering, a.k.a. gaze-contingent displays), and expressive (gaze synthesis). Diagnostic interaction is the mainstay of eye-tracked applications, including training or assessment of expertise, and is possibly the longest-standing use of gaze due to its mainly offline requirements. Diagnostic analysis of gaze is still very much in demand, especially in training situations such as flight or surgery training. Active interaction is rooted in the desire to use the eyes to point and click, with gaze gestures growing in popularity. Passive interaction is the manipulation of scene elements in response to gaze direction, e.g., to improve frame rate. Expressive eye movement is drawn from its synthesis, which can make use of a procedural (stochastic) model of eye motion driven by goal-oriented tasks such as reading. In discussing each form of interaction, seminal results and recent advancements are reviewed, highlighting outstanding research problems. The survey paper extends an invited proceedings contribution to VS-Games 2017.
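The passive (gaze-contingent) form described in the abstract can be sketched minimally: convert a scene point's screen offset from the tracked gaze position into angular eccentricity, then pick a coarser rendering level the farther the point lies from the fovea. The function names and the eccentricity band edges below are illustrative assumptions, not values taken from the survey.

```python
import math

def eccentricity_deg(gaze, point, viewer_distance_px):
    """Angular distance in degrees between the gaze point and a scene
    point, both in screen pixels, for a viewer at the given distance
    (distance expressed in the same pixel units)."""
    dx = point[0] - gaze[0]
    dy = point[1] - gaze[1]
    offset = math.hypot(dx, dy)
    return math.degrees(math.atan2(offset, viewer_distance_px))

def detail_level(ecc_deg, bands=(2.0, 8.0)):
    """Map eccentricity to a rendering level: 0 = full detail (fovea),
    1 = medium (parafovea), 2 = coarse (periphery). Band edges are
    hypothetical placeholders, not figures from the paper."""
    for level, edge in enumerate(bands):
        if ecc_deg < edge:
            return level
    return len(bands)

# A point near the gaze position renders at full detail; a distant
# one drops to the coarsest level, which is what lets a renderer
# spend fewer shading samples in the periphery to improve frame rate.
near = detail_level(eccentricity_deg((960, 540), (965, 540), 2000))
far = detail_level(eccentricity_deg((960, 540), (100, 100), 2000))
```

In a real gaze-contingent display the level would select a shading rate, mipmap bias, or geometric level of detail per region; the two-band split here is only the simplest possible instance of that idea.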
ISSN: 0097-8493, 1873-7684
DOI: 10.1016/j.cag.2018.04.002