Sonar processing by the spectrogram correlation and transformation model of biosonar

Full Description

Bibliographic Details
Published in: The Journal of the Acoustical Society of America, 2017-05, Vol. 141 (5), p. 3486
Main Authors: Haro, Stephanie; Simmons, James A.; Gaudette, Jason E.
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Echolocating big brown bats emit frequency-modulated (FM) biosonar sounds and perceive target range from echo delays through spectrogram correlation (SC) and target shape from interference nulls in echo spectra through spectrogram transformation (ST). Combined, the SCAT model is a computationally unified auditory description of biosonar as a real-time process. We developed a MATLAB implementation of SCAT and tested it with a succession of simulated bat-like FM signals (chirps), each followed by one or more FM echoes that have realistic delay and spectral characteristics. The model simulates neural response latencies in frequency-tuned delay-lines that use coincidence detections for target ranging by SC. For ST, a novel, deconvolution-like network transforms echo spectra into images of the target’s glints by detecting coincidences between spikes that represent spectral nulls in parallel channels tuned to null frequencies. Experiments show that dolphins likely separate ST into two operations: MaPS (macro power spectral features) for long glint separations and MiPS (micro power spectral features) for short glint separations (on the order of 80 µs and below). The ST deconvolution network models MiPS. The highly distributed character of the model favors real-time operation, an important goal for bioinspired sonar development. [Work supported by ONR.]
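The SC step described in the abstract estimates target range from echo delay. As a rough illustration of the underlying principle only (a matched-filter cross-correlation, not the frequency-tuned delay-line network the abstract describes), the sketch below recovers the delay of a simulated bat-like FM echo. All parameters (sample rate, sweep band, delay, echo amplitude) are invented for the example.

```python
import numpy as np

fs = 250_000                      # sample rate in Hz (assumed)
dur = 0.002                       # 2 ms call duration (assumed)
t = np.arange(int(fs * dur)) / fs
f_hi, f_lo = 100e3, 25e3          # downward FM sweep band (assumed, bat-like)
# linear downward chirp: instantaneous frequency falls from f_hi to f_lo
phase = 2 * np.pi * (f_hi * t - 0.5 * (f_hi - f_lo) / dur * t**2)
call = np.sin(phase)

true_delay = 300                  # echo delay in samples (1.2 ms)
echo = np.zeros(len(call) + true_delay)
echo[true_delay:] = 0.3 * call    # attenuated, delayed copy of the call

# matched-filter style cross-correlation; the peak lag is the echo delay
xc = np.correlate(echo, call, mode="full")
est_delay = np.argmax(xc) - (len(call) - 1)
print(est_delay)                  # -> 300
```

The chirp's sharp autocorrelation peak is what makes FM signals good for ranging; the SCAT model reaches the same answer with spike coincidences across frequency channels instead of an explicit correlation sum.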
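The ST step reads glint spacing from interference nulls: two overlapping reflections separated by Δt produce spectral nulls spaced 1/Δt apart in frequency. A minimal sketch of that relationship (not the spiking deconvolution network of the model) recovers an assumed 80 µs glint separation from the null spacing; sample rate and FFT length are invented for the example.

```python
import numpy as np

fs = 500_000                # sample rate in Hz (assumed)
n = 4096                    # FFT length (assumed)
dt_true = 80e-6             # glint separation to recover, in seconds

freqs = np.fft.rfftfreq(n, 1 / fs)
# two-glint transfer function 1 + exp(-j*2*pi*f*dt): nulls every 1/dt Hz
mag = np.abs(1 + np.exp(-2j * np.pi * freqs * dt_true))

# locate the spectral nulls as local minima of the magnitude spectrum
interior = (mag[1:-1] <= mag[:-2]) & (mag[1:-1] < mag[2:])
null_freqs = freqs[1:-1][interior]

# null spacing is 1/dt, so inverting the mean spacing gives the separation
dt_est = 1.0 / np.mean(np.diff(null_freqs))
print(dt_est)               # close to 80e-6 s (limited by FFT bin width)
```

Inverting null spacing into a glint delay is the deconvolution-like mapping the abstract attributes to the ST network, which does it with coincidences between null-tuned channels rather than an FFT.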
ISSN: 0001-4966, 1520-8524
DOI: 10.1121/1.4987268