A novel EEG-based approach to classify emotions through phase space dynamics

Bibliographic Details
Published in: Signal, Image and Video Processing, 2019-09, Vol. 13 (6), p. 1149-1156
Main Authors: Zangeneh Soroush, Morteza; Maghooli, Keivan; Setarehdan, Seyed Kamaledin; Nasrabadi, Ali Motie
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: Emotion recognition has drawn a great deal of attention in the brain-computer interface field. Researchers have concluded that nonlinear features are more successful in emotion classification because of the non-stationary and chaotic behavior of biological signals, especially the electroencephalogram (EEG). In this study, a new method based on phase space dynamics and Poincaré sections is introduced to classify emotions. This transformation quantifies the phase space and represents its characteristics in a new state space that directly reflects the signal's behavior. Not only are the proposed state space and quantifiers effective at describing nonlinear signal dynamics, but the method also works well in real-world applications such as EEG-based emotion recognition. In this work, EEGs are transformed into the new state space, and Poincaré intersections are taken as features with the aim of classifying EEGs into four emotional groups based on arousal and valence values. A reliable dataset is used, and classification accuracy is about 82%. The results suggest that the proposed method and features are efficient at this task compared with previous studies employing the same dataset.
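The general pipeline the abstract describes (reconstructing a phase space from a 1-D signal by time-delay embedding, then collecting the trajectory's intersections with a Poincaré section as features) can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the embedding dimension, delay, section plane, and scalar features are all assumptions chosen for demonstration.

```python
import numpy as np

def delay_embed(x, dim=3, tau=10):
    """Reconstruct a phase space trajectory from a 1-D signal
    via time-delay embedding (Takens-style)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def poincare_crossings(traj, axis=0, level=0.0):
    """Return points where the trajectory crosses the plane
    traj[:, axis] == level in the positive direction (a simple
    planar Poincare section), linearly interpolated."""
    c = traj[:, axis] - level
    idx = np.where((c[:-1] < 0) & (c[1:] >= 0))[0]
    frac = -c[idx] / (c[idx + 1] - c[idx])
    return traj[idx] + frac[:, None] * (traj[idx + 1] - traj[idx])

# Toy "EEG-like" signal: a noisy quasi-periodic oscillation
rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 4000)
x = np.sin(t) + 0.3 * np.sin(2.1 * t) + 0.05 * rng.standard_normal(t.size)

traj = delay_embed(x, dim=3, tau=10)
pts = poincare_crossings(traj)

# Hypothetical scalar features of the section (count and spread of
# intersection points) that could feed an emotion classifier:
features = [len(pts), pts[:, 1].std(), pts[:, 2].std()]
```

In practice such features would be computed per EEG channel and epoch and passed to a classifier trained on the arousal/valence labels; the specific quantifiers used in the paper are not detailed in this record.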
ISSN: 1863-1703
1863-1711
DOI: 10.1007/s11760-019-01455-y