Your attention, please! Determining saliency of competing audio stimuli in natural scenarios
Published in: The Journal of the Acoustical Society of America, 2013-05, Vol. 133 (5_Supplement), p. 3354
Main authors: , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Perceptual saliency is a precursor to bottom-up attention modeling. While visual saliency models are approaching maturity, auditory models remain in their infancy. This is mainly due to the lack of robust methods for gathering basic data, and to oversimplifications such as the assumption of monaural signals. Here we present the rationale and initial results of a newly designed experimental paradigm testing the auditory saliency of natural sounds in a binaural listening scenario. Our main goal is to explore the idea that the saliency of a sound depends on its relation to background sounds, by using more than one sound at a time, presented against different backgrounds. An analysis of the relevant, emerging acoustical correlates, together with other descriptors, is performed. A review of current auditory saliency models and of the deficiencies of conventional testing approaches is provided. These motivate the development of our experimental test bed and of more formalized stimulus selection criteria to support more versatile and ecologically relevant saliency models. Applications to auditory scene analysis and sound synthesis are briefly discussed. Some initial conclusions are drawn about the definition of an expanded feature set to be used for auditory saliency modeling and prediction in the context of natural, everyday sounds.
ISSN: 0001-4966 (print), 1520-8524 (online)
DOI: 10.1121/1.4805703
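
The abstract refers to an "expanded feature set" of acoustical correlates for saliency modeling without listing the features themselves. As a purely illustrative sketch, not the authors' method, the Python snippet below computes a few conventional per-frame descriptors from a binaural recording: RMS level, spectral centroid, spectral flux, and interaural level difference. Every function name, frame size, and feature choice here is an assumption for illustration only.

```python
# Illustrative sketch only: candidate per-frame acoustic descriptors of the
# kind an expanded saliency feature set might start from. Not the authors'
# method; FRAME, HOP, and all feature choices are assumed values.
import numpy as np

FRAME, HOP = 1024, 512  # analysis frame length and hop, in samples (assumed)

def frame_signal(x):
    """Slice a 1-D signal into overlapping frames of FRAME samples, HOP apart."""
    n_frames = 1 + max(0, (len(x) - FRAME) // HOP)
    idx = np.arange(FRAME)[None, :] + HOP * np.arange(n_frames)[:, None]
    return x[idx]

def candidate_features(stereo, sr):
    """Per-frame descriptors from a (n_samples, 2) binaural signal.

    Returns RMS level, spectral centroid, and spectral flux computed on the
    mono mix, plus interaural level difference (ILD) as a simple binaural cue.
    """
    left, right = stereo[:, 0], stereo[:, 1]
    mono = 0.5 * (left + right)

    frames = frame_signal(mono)
    mag = np.abs(np.fft.rfft(frames * np.hanning(FRAME), axis=1))  # magnitude spectra
    freqs = np.fft.rfftfreq(FRAME, d=1.0 / sr)
    eps = 1e-12

    rms = np.sqrt(np.mean(frames ** 2, axis=1))                    # frame energy
    centroid = (mag @ freqs) / (mag.sum(axis=1) + eps)             # spectral "brightness"
    flux = np.r_[0.0, np.sqrt(np.sum(np.diff(mag, axis=0) ** 2, axis=1))]  # frame-to-frame spectral change
    ild = 10 * np.log10((np.mean(frame_signal(left) ** 2, axis=1) + eps)
                        / (np.mean(frame_signal(right) ** 2, axis=1) + eps))

    return {"rms": rms, "centroid_hz": centroid, "flux": flux, "ild_db": ild}

# Tiny usage example: a 440 Hz tone that is about 6 dB louder in the left ear,
# so the ILD track should sit near +6 dB across frames.
if __name__ == "__main__":
    sr = 44100
    t = np.arange(sr) / sr
    tone = np.sin(2 * np.pi * 440 * t)
    feats = candidate_features(np.stack([tone, 0.5 * tone], axis=1), sr)
    print({k: v.shape for k, v in feats.items()})
```

Such mono descriptors alone would miss the abstract's central point, that saliency depends on a sound's relation to its background in a binaural scene; a feature set in that spirit would also need relational measures (e.g., contrasts between foreground and background statistics), which this sketch does not attempt.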