Aladdin's magic carpet: Navigation by in-air static hand gesture in autonomous vehicles

Detailed Description

Bibliographic Details
Published in: International Journal of Human-Computer Interaction 2020-12, Vol. 36 (20), p. 1912-1927
Main authors: Qian, Xiaosong; Ju, Wendy; Sirkin, David Michael
Format: Article
Language: English
Online access: Full text
Description
Summary: This paper presents a novel and exploratory investigation of how users might control future autonomous vehicles with user-defined in-air static hand gestures. In the era of autonomous vehicles, how to support "driving" without steering control may become a key issue affecting user experience. As the navigation interface of future autonomous cars will be wholly dependent on GPS, and passengers will be unable to make manual control adjustments, verbal or gestural communication will become a primary means to assist them in vehicle control. We thus focus on gesture as an innovative solution to this "final 100 meters" problem in automated navigation. A study (N = 24) conducted in a full chassis simulator shows that hand gesture control is feasible for autonomous vehicle navigation. It further reveals that hand gestures are influenced by task types, local regional norms, and participant culture. In particular, the spatial region where gestures occur can affect execution time, gender can impact user preference and demand, and cultural differences and priorities can affect adoption, ease of learning, and user comfort, influencing longer-term use.
ISSN: 1044-7318
1532-7590
DOI: 10.1080/10447318.2020.1801225