SonoHaptics: An Audio-Haptic Cursor for Gaze-Based Object Selection in XR
Saved in:
Main authors: , , , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: We introduce SonoHaptics, an audio-haptic cursor for gaze-based 3D object selection. SonoHaptics addresses challenges around providing accurate visual feedback during gaze-based selection in Extended Reality (XR), e.g., the lack of world-locked displays in no- or limited-display smart glasses and visual inconsistencies. To enable users to distinguish objects without visual feedback, SonoHaptics employs the concept of cross-modal correspondence in human perception to map visual features of objects (color, size, position, material) to audio-haptic properties (pitch, amplitude, direction, timbre). We contribute data-driven models for determining cross-modal mappings of visual features to audio and haptic features, and a computational approach to automatically generate audio-haptic feedback for objects in the user's environment. SonoHaptics provides global feedback that is unique to each object in the scene, and local feedback to amplify differences between nearby objects. Our comparative evaluation shows that SonoHaptics enables accurate object identification and selection in a cluttered scene without visual feedback.
DOI: 10.48550/arxiv.2409.00784
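The cross-modal mapping the abstract describes (color, size, position, material mapped to pitch, amplitude, direction, timbre) can be sketched as a simple feature-to-feature transform. The linear ranges, feature encodings, and type names below are illustrative assumptions; the paper itself contributes data-driven models for these mappings, which this sketch does not reproduce.

```python
# Illustrative sketch of a cross-modal mapping from visual object
# features to audio-haptic cue parameters. All ranges and names here
# are hypothetical, not taken from the SonoHaptics paper.
from dataclasses import dataclass

@dataclass
class VisualObject:
    hue: float        # normalized color hue, 0.0-1.0
    size: float       # normalized object size, 0.0-1.0
    azimuth: float    # horizontal position, -1.0 (left) to 1.0 (right)
    roughness: float  # material roughness, 0.0 (smooth) to 1.0 (rough)

@dataclass
class AudioHapticCue:
    pitch_hz: float   # color -> pitch
    amplitude: float  # size -> loudness / vibration strength
    pan: float        # position -> spatial direction of the cue
    timbre: float     # material -> harmonic richness

def map_object(obj: VisualObject) -> AudioHapticCue:
    """Map visual features to audio-haptic properties.

    Linear mappings chosen for illustration only; SonoHaptics derives
    its mappings from perception data (cross-modal correspondence).
    """
    return AudioHapticCue(
        pitch_hz=220.0 + obj.hue * (880.0 - 220.0),  # A3..A5 range
        amplitude=0.2 + obj.size * 0.8,              # keep cues audible
        pan=obj.azimuth,
        timbre=obj.roughness,
    )
```

With a mapping like this, each object in a scene yields a distinct cue, which matches the abstract's notion of global per-object feedback; local feedback that amplifies differences between nearby objects would require an additional contrast step not shown here.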