EEG Correlates of Voice and Face Emotional Judgments in the Human Brain

Bibliographic Details
Published in: Cognitive Computation 2015-02, Vol. 7 (1), pp. 11-19
Authors: Hiyoshi-Taniguchi, K., Kawasaki, M., Yokota, T., Bakardjian, H., Fukuyama, H., Cichocki, A., Vialatte, F. B.
Format: Article
Language: English
Abstract: The purpose of this study is to clarify the neural correlates of human emotional judgment. This study aimed to induce a controlled perturbation in the emotional system of the brain using multimodal stimuli, and to investigate whether such emotional stimuli could induce reproducible and consistent changes in electroencephalography (EEG) signals. We exposed 12 subjects to auditory, visual, or combined audio–visual stimuli. Audio stimuli consisted of voice recordings of the Japanese word "arigato" (thank you) pronounced with three different intonations (angry [A], happy [H], or neutral [N]). Visual stimuli consisted of faces of women expressing the same emotional valences (A, H, or N). Audio–visual stimuli were composed using either congruent combinations of faces and voices (e.g., H × H) or noncongruent combinations (e.g., A × H). The data were collected using an EEG system, and analysis was performed by computing the topographic distributions of EEG power in the theta, alpha, and beta frequency ranges. We compared the stimulus conditions (A, H, or N), as well as congruent versus noncongruent combinations. Topographic maps of EEG power differed between these conditions. These results demonstrate that EEG could be used as a tool to investigate emotional valence and discriminate various emotions.
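
The band-power analysis described in the abstract (per-channel EEG power in the theta, alpha, and beta ranges, then plotted as a scalp topography) can be sketched in a few lines of Python. The sketch below is purely illustrative and is not the authors' code: the band edges, sampling rate, channel count, and the use of Welch's method are all assumptions, and the final step of projecting the values onto a scalp map (e.g., with MNE-Python's plotting utilities) is omitted.

```python
import numpy as np
from scipy.signal import welch

# Frequency bands analyzed in the study; the edges here are common
# conventions, not values taken from the paper.
BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_powers(eeg, fs):
    """Per-channel mean band power via Welch's method.

    eeg : array of shape (n_channels, n_samples)
    fs  : sampling rate in Hz
    Returns {band_name: array of shape (n_channels,)}.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs), axis=-1)
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = psd[:, mask].mean(axis=-1)  # mean PSD within the band
    return out

# Usage with synthetic data: 32 channels, 10 s at 256 Hz.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 10 * 256))
topo = band_powers(eeg, fs=256.0)
print({band: p.shape for band, p in topo.items()})  # one value per channel
```

Each resulting vector holds one band-power value per electrode; comparing these vectors across stimulus conditions (A, H, N; congruent vs. noncongruent) yields the topographic differences the abstract reports.
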
ISSN: 1866-9956
eISSN: 1866-9964
DOI: 10.1007/s12559-013-9225-0