The impact of a visual spatial frame on real sound-source localization in virtual reality

Bibliographic Details
Published in: Current Research in Behavioral Sciences, 2020-11, Vol. 1, p. 100003, Article 100003
Authors: Valzolgher, Chiara; Alzhaler, Mariam; Gessa, Elena; Todeschini, Michela; Nieto, Pauline; Verdelet, Gregoire; Salemme, Romeo; Gaveau, Valerie; Marx, Mathieu; Truy, Eric; Barone, Pascal; Farnè, Alessandro; Pavani, Francesco
Format: Article
Language: English
Online access: Full text
Description
Summary:

Highlights:
• Virtual reality and kinematic tracking open new scenarios for the study of spatial hearing.
• Here we examined the impact of a minimal visual spatial frame on sound localization.
• The presence of a visual spatial frame improved hand-pointing in elevation.
• Seeing a visual grid resulted in faster first gaze movements to sounds.
• Sound localization benefits from the presence of a minimal visual spatial frame.

Studies on audio-visual interactions in sound localization have primarily focused on the relations between the spatial position of sounds and their perceived visual source, as in the famous ventriloquist effect. Much less work has examined the effects on sound localization of seeing aspects of the visual environment. In this study, we took advantage of an innovative method for the study of spatial hearing – based on real sounds, virtual reality and real-time kinematic tracking – to examine the impact of a minimal visual spatial frame on sound localization. We tested sound localization in normal-hearing participants (N = 36) in two visual conditions: a uniform gray scene and a simple visual environment comprising only a grid. In both cases, no visual cues about the sound sources were provided. During and after sound emission, participants were free to move their head and eyes without restriction. We found that the presence of a visual spatial frame improved hand-pointing in elevation. In addition, it led to faster first gaze movements to sounds. Our findings show that sound localization benefits from the presence of a minimal visual spatial frame and confirm the importance of combining kinematic tracking and virtual reality when aiming to reveal the multisensory and motor contributions to spatial-hearing abilities.
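For readers who want a concrete sense of how hand-pointing responses can be scored in azimuth and elevation in a setup like this, the sketch below converts head-centered Cartesian pointing endpoints and true source positions into angular coordinates and computes signed errors. This is a minimal illustration under assumed conventions (axis layout, function names, error definition); it is not the analysis pipeline reported in the article.

```python
import numpy as np

def cartesian_to_az_el(xyz):
    """Convert head-centered Cartesian coordinates (x: right, y: up, z: front)
    to azimuth and elevation angles in degrees. Assumed convention, for illustration."""
    x, y, z = xyz
    azimuth = np.degrees(np.arctan2(x, z))                  # left/right angle
    elevation = np.degrees(np.arctan2(y, np.hypot(x, z)))   # up/down angle
    return azimuth, elevation

def signed_errors(pointed_xyz, source_xyz):
    """Signed azimuth and elevation errors (pointed minus true), in degrees."""
    az_p, el_p = cartesian_to_az_el(pointed_xyz)
    az_s, el_s = cartesian_to_az_el(source_xyz)
    return az_p - az_s, el_p - el_s

# Hypothetical trial: response pointed slightly left of and below the true source.
pointed = np.array([-0.05, 0.10, 1.00])   # meters, head-centered
source = np.array([0.00, 0.20, 1.00])
az_err, el_err = signed_errors(pointed, source)
print(f"azimuth error: {az_err:.1f} deg, elevation error: {el_err:.1f} deg")
```

In this scoring scheme, an effect of the visual grid on elevation accuracy would show up as smaller absolute elevation errors in the grid condition than in the uniform gray condition.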
ISSN: 2666-5182
DOI: 10.1016/j.crbeha.2020.100003