Aural localization of silent objects by active human biosonar: neural representations of virtual echo-acoustic space

Detailed description

Bibliographic details
Published in: The European Journal of Neuroscience, 2015-03, Vol. 41 (5), p. 533-545
Authors: Wallmeier, Ludwig; Kish, Daniel; Wiegrebe, Lutz; Flanagin, Virginia L.
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Some blind humans have developed the remarkable ability to detect and localize objects through the auditory analysis of self‐generated tongue clicks. These echolocation experts show a corresponding increase in 'visual' cortex activity when listening to echo‐acoustic sounds. Echolocation in real‐life settings involves multiple reflections as well as active sound production, neither of which has been systematically addressed. We developed a virtualization technique that allows participants to actively perform such biosonar tasks in virtual echo‐acoustic space during magnetic resonance imaging (MRI). Tongue clicks, emitted in the MRI scanner, are picked up by a microphone, convolved in real time with the binaural impulse responses of a virtual space, and presented via headphones as virtual echoes. In this manner, we investigated brain activity during active echo‐acoustic localization tasks. Our data show that, in blind echolocation experts, activations in the calcarine cortex are dramatically enhanced when a single reflector is introduced into otherwise anechoic virtual space. A pattern‐classification analysis revealed that, in the blind, calcarine cortex activation patterns could discriminate left‐side from right‐side reflectors. This was found in both blind experts, but the effect was significant for only one of them. In sighted controls, 'visual' cortex activations were insignificant, but activation patterns in the planum temporale were sufficient to discriminate left‐side from right‐side reflectors. Our data suggest that blind and echolocation‐trained, sighted subjects may recruit different neural substrates for the same active‐echolocation task. In the current study, we present a new technique that allows active echolocation experiments in the fMRI scanner.
Using this technique, we conducted a psychophysical experiment with two blind echolocation experts and four sighted control subjects in the fMRI scanner, showing that sound reflector localization through active echolocation is possible in an MRI environment. Finally, we present imaging data in terms of BOLD activations recorded while subjects solved active echo‐acoustic tasks in the scanner, including both their motor and sensory components.
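The core of the virtualization technique described above is the convolution of the emitted click with the binaural impulse responses of a virtual space. A minimal sketch of that signal chain is given below; the sampling rate, click waveform, and single-tap echo impulse responses are illustrative assumptions, not the authors' actual setup, which used measured binaural impulse responses and real-time processing.

```python
import numpy as np
from scipy.signal import fftconvolve

fs = 48000  # assumed sampling rate in Hz

# A synthetic "tongue click": a short, exponentially decaying broadband burst.
t = np.arange(int(0.005 * fs)) / fs                # 5 ms duration
click = np.sin(2 * np.pi * 3000 * t) * np.exp(-t / 0.001)

def echo_ir(delay_s, gain, length_s=0.05):
    """Toy single-reflector impulse response: one delayed, attenuated tap."""
    ir = np.zeros(int(length_s * fs))
    ir[int(delay_s * fs)] = gain
    return ir

# A reflector on the listener's left: the echo reaches the left ear
# slightly earlier and louder than the right ear (interaural time and
# level differences that support left/right localization).
ir_left = echo_ir(delay_s=0.0100, gain=0.50)
ir_right = echo_ir(delay_s=0.0106, gain=0.35)

# Convolve the emitted click with each ear's impulse response to obtain
# the virtual binaural echo that would be presented over headphones.
echo_l = fftconvolve(click, ir_left)
echo_r = fftconvolve(click, ir_right)
binaural = np.stack([echo_l, echo_r], axis=1)      # (samples, 2) stereo
```

In the experiment itself this convolution ran in real time on the microphone signal; the offline version here only illustrates how a reflector's position is encoded in the interaural differences of the returned echo.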
ISSN:0953-816X
1460-9568
DOI:10.1111/ejn.12843