i-VALS: Visual Attention Localization for Mobile Service Computing

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 45166-45181
Authors: Jiang, Zhiping; Zhao, Kun; Li, Rui; Zhao, Jizhong
Format: Article
Language: English
Abstract: Identifying and localizing the user's visual attention can enable various intelligent service computing paradigms in a mobile environment. However, existing solutions can only compute the gaze direction, not the distance to the intended target. In addition, most of them rely on an eye tracker or similar infrastructure support. This paper explores the possibility of using portable mobile devices, e.g., a smartphone, to detect the visual attention of a user. i-VALS only requires the user to perform one simple action to localize the intended object: gazing at the object while holding up the smartphone so that the object and the user's face are simultaneously captured by the rear and front cameras. We develop efficient algorithms to obtain the distance between the camera and the user, the user's gaze direction, and the object's direction from the camera. The object's location can then be computed by solving a trigonometric problem. i-VALS has been prototyped on commercial off-the-shelf (COTS) devices. Extensive experimental results show that i-VALS achieves high accuracy and low latency, effectively supporting a large variety of applications in smart environments.
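The final triangulation step the abstract mentions can be sketched as follows. This is a hedged illustration, not the authors' implementation: assuming a 2D simplification in which the camera sits at the origin, the user's position is known from the camera-to-user distance, and the gaze and object directions are given as vectors, the object lies at the intersection of the gaze ray (from the user) and the object-direction ray (from the camera).

```python
import numpy as np

def localize_object(user_offset, gaze_dir, obj_dir):
    """Hypothetical sketch of the triangulation described in the abstract.

    user_offset : camera-to-user displacement vector (distance * direction)
    gaze_dir    : user's gaze direction (need not be normalized)
    obj_dir     : object's direction from the rear camera

    The object is where the camera ray t*obj_dir meets the gaze ray
    user_offset + s*gaze_dir, i.e. the solution of
        t*obj_dir - s*gaze_dir = user_offset.
    """
    g = np.asarray(gaze_dir, dtype=float)
    o = np.asarray(obj_dir, dtype=float)
    u = np.asarray(user_offset, dtype=float)
    # Linear system [o  -g] @ [t, s]^T = u; singular if the rays are parallel.
    A = np.column_stack([o, -g])
    t, s = np.linalg.solve(A, u)
    return t * o  # object position relative to the camera

# Example: user 0.5 m behind the camera, object at (2, 1) in camera frame.
pos = localize_object(user_offset=[0.0, -0.5],
                      gaze_dir=[2.0, 1.5],   # from user toward (2, 1)
                      obj_dir=[2.0, 1.0])    # camera sees object along (2, 1)
# pos ≈ [2.0, 1.0]
```

Because both direction inputs are rays rather than unit vectors, only the ratio of the solved scalars matters; the returned point is in the same units as `user_offset`.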
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2907147