Midbrain Combinatorial Code for Temporal and Spectral Information in Concurrent Acoustic Signals
Published in: Journal of Neurophysiology, 1999-02, Vol. 81 (2), p. 552-563
Format: Article
Language: English
Online access: Full text
Abstract: 1 Section of Neurobiology and Behavior, Cornell University, Ithaca, New York 14853; and 2 University of California Bodega Marine Laboratory, Bodega Bay, California 94923.

Midbrain combinatorial code for temporal and spectral information in concurrent acoustic signals. All vocal species, including humans, often encounter simultaneous (concurrent) vocal signals from conspecifics. To segregate concurrent signals, the auditory system must extract information regarding the individual signals from their summed waveforms. During the breeding season, nesting male midshipman fish (Porichthys notatus) congregate in localized regions of the intertidal zone and produce long-duration (>1 min), multi-harmonic signals ("hums") during courtship of females. The hums of neighboring males often overlap, resulting in acoustic beats with amplitude and phase modulations at the difference frequencies (dFs) between their fundamental frequencies (F0s) and harmonic components. Behavioral studies also show that midshipman can localize a single hum-like tone when presented with a choice between two concurrent tones that originate from separate speakers. A previous study of the neural mechanisms underlying the segregation of concurrent signals demonstrated that midbrain neurons temporally encode a beat's dF through spike synchronization; however, spectral information about at least one of the beat's components is also required for signal segregation. Here we examine the encoding of spectral differences in beat signals by midbrain neurons. The results show that, although the spike rate responses of many neurons are sensitive to the spectral composition of a beat, virtually all midbrain units can encode information about differences in the spectral composition of beat stimuli via their interspike intervals (ISIs), with an equal distribution of ISI spectral sensitivity across the behaviorally relevant dFs. Together, temporal encoding in the midbrain of dF information through spike synchronization and of spectral information through ISI could permit the segregation of concurrent vocal signals.
Copyright © 1999 The American Physiological Society
ISSN: 0022-3077, 1522-1598
DOI: 10.1152/jn.1999.81.2.552
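The beat phenomenon at the heart of the abstract follows from simple superposition. A minimal worked identity, assuming two equal-amplitude pure tones at fundamental frequencies \(f_1\) and \(f_2\) (a simplification of the multi-harmonic hums described in the abstract):

\[
\cos(2\pi f_1 t) + \cos(2\pi f_2 t) = 2\,\cos\!\bigl(\pi (f_1 - f_2)\,t\bigr)\,\cos\!\bigl(\pi (f_1 + f_2)\,t\bigr),
\]

i.e., a carrier near the mean frequency \((f_1 + f_2)/2\) whose amplitude envelope rises and falls at the difference frequency \(\mathrm{dF} = |f_1 - f_2|\). In multi-harmonic hums, analogous difference frequencies arise between corresponding harmonic components; this dF is the temporal cue the abstract describes midbrain neurons encoding through spike synchronization.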