A comparative evaluation of direct hand and wand interactions on consumer devices


Detailed description

Bibliographic details
Published in: Computers & Graphics, 2018-12, Vol. 77, p. 108-121
Authors: Figueiredo, Lucas; Rodrigues, Eduardo; Teixeira, João; Teichrieb, Veronica
Format: Article
Language: English
Online access: Full text
Description
Summary:

- Evaluation of two input techniques for VR applications: wands and hands.
- Elaboration of five distinct test cases comprising different interaction metaphors.
- Proposition of an interaction layer for enhancing Unity development.
- Proposition of a new ray-casting interaction method based on hand tracking.
- Delivery of a set of metaphors that can be used in VR, AR, and MR applications.

Along with the popularization of VR head-mounted displays, there is an increasing demand for understanding how to use these devices within VR applications. This work evaluates two input techniques for VR applications: wands and hands. We perform experiments using consumer devices (the Leap Motion Controller and the HTC Vive), aiming to understand how popular hardware responds to users' needs. Five distinct scenarios were tested, exploring both near and far object interaction. The evaluation proceeds in three steps: a user profile evaluation, a system performance evaluation, and the System Usability Scale questionnaire. The results showed that, even with lower task accuracy, the natural interaction provided by a hand representation in the virtual world won users' preference when interacting with virtual elements close to them. For distant object interaction, it still needs some improvement.
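The System Usability Scale (SUS) questionnaire mentioned as the third evaluation step is scored with a standard formula: each of the ten 1-5 Likert items contributes (response - 1) for odd-numbered items and (5 - response) for even-numbered items, and the sum is scaled by 2.5 to yield a 0-100 score. A minimal sketch (the function name is illustrative, not from the paper):

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (1-indexed) contribute (response - 1);
    even-numbered items contribute (5 - response).
    The total is multiplied by 2.5, giving a score from 0 to 100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Fully positive answers (5 on odd items, 1 on even items) yield the maximum score.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

Per-participant SUS scores such as these are typically averaged across the study population to compare conditions (here, wand versus hand input).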
ISSN: 0097-8493
ISSN: 1873-7684
DOI: 10.1016/j.cag.2018.10.006