Fundamental differences in change detection between vision and audition

Bibliographic Details
Published in: Experimental Brain Research, 2010-06, Vol. 203 (2), pp. 261-270
Authors: Demany, Laurent; Semal, Catherine; Cazalets, Jean-René; Pressnitzer, Daniel
Format: Article
Language: English
Online access: Full text
Description
Abstract: We compared auditory change detection to visual change detection using closely matched stimuli and tasks in the two modalities. On each trial, participants were presented with a test stimulus consisting of ten elements: pure tones with various frequencies for audition, or dots with various spatial positions for vision. The test stimulus was preceded or followed by a probe stimulus consisting of a single element, and two change-detection tasks were performed. In the "present/absent" task, the probe either matched one randomly selected element of the test stimulus or none of them; participants reported present or absent. In the "direction-judgment" task, the probe was always slightly shifted relative to one randomly selected element of the test stimulus; participants reported the direction of the shift. Whereas visual performance was systematically better in the present/absent task than in the direction-judgment task, the opposite was true for auditory performance. Moreover, whereas visual performance was strongly dependent on selective attention and on the time interval separating the probe from the test stimulus, this was not the case for auditory performance. Our results show that small auditory changes can be detected automatically across relatively long temporal gaps, using an implicit memory system that seems to have no similar counterpart in the visual domain.
ISSN: 0014-4819, 1432-1106
DOI: 10.1007/s00221-010-2226-2