Speech, emotion and language: A neuroscientific exploration

Detailed Description

Bibliographic Details
Published in: The Journal of the Acoustical Society of America, 2019-10, Vol. 146 (4), p. 2846
Main Authors: Sanyal, Shankha; Banerjee, Archi; Karmakar, Samir; Ghosh, Dipak
Format: Article
Language: English
Online Access: Full text
Description
Summary: While the emotional state of a person can be manifested in different ways, such as facial expressions, gestures, movements and postures, recognition of emotion from speech has attracted particular interest. This study attempts to understand the different factors that influence Speech Emotion Recognition (SER), taking into account one of the most important parameters: language. We classify and compare four basic emotions (anger, happiness, sadness and neutral) from speech segments of four linguistic groups: Bengali, English, German and Spanish. A robust nonlinear feature, multifractal width, was used to develop a language-independent emotion classification model. EEG was recorded from different groups of L1-speaking participants to understand the neuro-cognitive appraisal corresponding to speech segments in different languages, i.e., to detect any speech features that contribute exclusively to emotion recognition in a particular language. The nonlinear, non-stationary EEG time series were analyzed with state-of-the-art multifractal features. This first-of-its-kind study is expected to shed light on whether the emotional content of speech depends on language or solely on prosodic features such as pitch, loudness, tempo, stress and rhythm.
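The abstract does not specify the exact algorithm behind the "multifractal width" feature, but a common way to estimate it is multifractal detrended fluctuation analysis (MFDFA): integrate the signal, detrend it at multiple scales, and measure how the generalized Hurst exponent h(q) varies with the moment order q. The sketch below is a minimal illustration under that assumption; the scale and q ranges are arbitrary choices, not the study's parameters.

```python
import numpy as np

def mfdfa_width(x, scales, q_values, order=1):
    """Estimate multifractal width via an MFDFA-style procedure.

    Returns the spread of generalized Hurst exponents h(q), a simple
    proxy for the width of the multifractal spectrum.
    """
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())  # step 1: integrated (cumulative) profile
    h = []
    for q in q_values:
        log_F = []
        for s in scales:
            n_seg = len(profile) // s
            segs = profile[:n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            # steps 2-3: detrend each segment with a local polynomial fit
            F2 = np.empty(n_seg)
            for v in range(n_seg):
                coef = np.polyfit(t, segs[v], order)
                F2[v] = np.mean((segs[v] - np.polyval(coef, t)) ** 2)
            # step 4: q-th order fluctuation function
            if q == 0:
                Fq = np.exp(0.5 * np.mean(np.log(F2)))
            else:
                Fq = np.mean(F2 ** (q / 2.0)) ** (1.0 / q)
            log_F.append(np.log(Fq))
        # step 5: h(q) is the slope of log F_q(s) versus log s
        h.append(np.polyfit(np.log(scales), log_F, 1)[0])
    h = np.asarray(h)
    return h.max() - h.min()  # wide spread => stronger multifractality

# Usage: white noise is nearly monofractal, so its width stays small,
# whereas emotional speech or EEG would typically yield a larger value.
rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)
scales = np.array([16, 32, 64, 128, 256])
q_values = np.array([-5, -3, -1, 1, 3, 5])
width = mfdfa_width(noise, scales, q_values)
```

A larger width indicates a broader range of scaling behaviors in the signal, which is the property the authors exploit as a language-independent emotion cue.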
ISSN:0001-4966
1520-8524
DOI:10.1121/1.5136872