Categorizing auditory objects of perceived sounds in the built environment
Published in: The Journal of the Acoustical Society of America, 2022-04, Vol. 151 (4), p. A51
Main authors:
Format: Article
Language: English
Online access: Full text
Abstract: This exploratory study takes a cognitive approach to categorizing complex auditory scenes in the built environment, whereas prior studies have concentrated on outdoor acoustic environments. Six experts and 30 non-experts performed a free chip sorting task to assess 70 binaural recordings taken from indoor spaces. Public spaces were drawn from healthcare, working, cultural, educational, leisure, worship, and transportation settings (e.g., bus, train, and metro stations, and airports). The participants were asked to classify the auditory objects into sound categories based on descriptive labels provided by the authors to identify the sound sources. The findings, obtained through hierarchical agglomerative clustering and non-metric multidimensional scaling (MDS), show three prominent category labels for perceived sound in the indoor acoustic environment: (1) intelligible and unintelligible speech; (2) periodic and transient sounds; and (3) stationary and non-stationary sounds. Human-generated sounds such as conversation, laughter, footsteps, and coughing vary over time according to the context of the built environment. Moreover, technology-related sounds, such as mechanical and electronic ones, exhibit deterministic and random characteristics that differ according to the function of the space.
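The analysis pipeline described in the abstract (free sorting task → pairwise dissimilarities → hierarchical agglomerative clustering and non-metric MDS) can be sketched as follows. All data, the linkage method, and the cluster count here are hypothetical illustrations, not the study's actual settings or materials (six toy items stand in for the 70 recordings).

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform
from sklearn.manifold import MDS

# Hypothetical free-sorting data: each participant assigns every item
# (recording) an arbitrary group label; only co-membership matters.
sorts = [
    [0, 0, 1, 1, 2, 2],   # participant 1
    [0, 0, 0, 1, 2, 2],   # participant 2
    [0, 1, 1, 1, 2, 2],   # participant 3
]

n_items = len(sorts[0])
co = np.zeros((n_items, n_items))
for s in sorts:
    for i in range(n_items):
        for j in range(n_items):
            if s[i] == s[j]:
                co[i, j] += 1

# Dissimilarity = 1 - proportion of participants who grouped the pair together.
dissim = 1.0 - co / len(sorts)
np.fill_diagonal(dissim, 0.0)

# Hierarchical agglomerative clustering on the condensed dissimilarity matrix
# (average linkage chosen here purely for illustration).
Z = linkage(squareform(dissim, checks=False), method="average")
labels = fcluster(Z, t=3, criterion="maxclust")

# Non-metric MDS embedding of the same dissimilarities into two dimensions.
nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
           random_state=0)
coords = nmds.fit_transform(dissim)

print(labels)         # one cluster assignment per item
print(coords.shape)   # 2-D coordinates, one row per item
```

In a free-sorting study this co-occurrence construction is a standard way to turn categorical sorts into a distance structure; the resulting dendrogram (from `Z`) and the 2-D NMDS configuration are then interpreted together to name the emergent categories.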
ISSN: 0001-4966, 1520-8524
DOI: 10.1121/10.0010629