EEG emotion recognition model based on the LIBSVM classifier
Published in: Measurement: Journal of the International Measurement Confederation, 2020-11, Vol. 164, p. 108047, Article 108047
Main authors: , , , ,
Format: Article
Language: English
Online access: Full text
Abstract:
• Electroencephalogram-based emotion recognition has better reliability.
• Two-category classifications are performed on Arousal and Valence, respectively.
• Different types of features are selected and combined.
• Different combinations of brain channels are explored.
This paper proposes an electroencephalogram (EEG) emotion recognition method based on the LIBSVM classifier. EEG features are calculated to represent the characteristics associated with emotional states. First, the Lempel–Ziv complexity and wavelet detail coefficients are calculated for the pre-processed EEG signals; then, the co-integration relationship between channels is obtained from a cointegration test; next, Empirical Mode Decomposition (EMD) is performed on the pre-processed EEG signals; finally, the average approximate entropy of the first four Intrinsic Mode Functions (IMFs) is calculated. The four calculated features are input to the LIBSVM classifier to classify the emotion of each channel's data, and the per-channel classification results are then fused by a Takagi–Sugeno fuzzy model to produce the final emotion classification. The experimental results show that for two-category classification on Arousal and Valence, the average emotion recognition rates are 74.88% and 82.63%, respectively.
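To make the pipeline concrete, the following minimal Python sketch computes the four per-channel features described above. The library choices (PyWavelets, PyEMD, statsmodels) and the parameter choices (median binarization for the Lempel–Ziv complexity, a 'db4' wavelet at level 4 with detail-band energy as the summary statistic, ApEn with m = 2 and r = 0.2·std, and the Engle–Granger cointegration statistic against a single reference channel) are illustrative assumptions; the abstract does not specify any of them.

```python
# Minimal sketch of the per-channel feature extraction (assumptions noted above).
import numpy as np
import pywt                                  # PyWavelets
from PyEMD import EMD                        # EMD-signal package
from statsmodels.tsa.stattools import coint  # Engle-Granger cointegration test

def lempel_ziv_complexity(signal):
    """Count distinct phrases in the median-binarized signal (LZ parsing)."""
    med = np.median(signal)
    s = ''.join('1' if x > med else '0' for x in signal)
    phrases, i, k, c = set(), 0, 1, 0
    while i + k <= len(s):
        if s[i:i + k] in phrases:
            k += 1                     # extend the current phrase
        else:
            phrases.add(s[i:i + k])    # new phrase: record it and move on
            c += 1
            i, k = i + k, 1
    return c

def approximate_entropy(x, m=2, r_factor=0.2):
    """ApEn(m, r) with r = r_factor * std(x) and Chebyshev distance."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    def phi(m):
        templates = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        # Fraction of templates within distance r of each template
        # (self-matches included, as in the standard ApEn definition).
        counts = [np.mean(np.max(np.abs(templates - t), axis=1) <= r)
                  for t in templates]
        return np.mean(np.log(counts))
    return phi(m) - phi(m + 1)

def channel_features(signal, reference_channel):
    """Return the four features for one pre-processed EEG channel."""
    signal = np.asarray(signal, dtype=float)
    # 1. Lempel-Ziv complexity of the channel.
    lzc = lempel_ziv_complexity(signal)
    # 2. Wavelet detail coefficients, summarized here as level-4 detail energy.
    coeffs = pywt.wavedec(signal, 'db4', level=4)
    wavelet_energy = float(np.sum(coeffs[1] ** 2))
    # 3. Cointegration test statistic against another channel.
    coint_stat, _, _ = coint(signal, reference_channel)
    # 4. Mean approximate entropy of the first four IMFs from EMD.
    imfs = EMD().emd(signal)
    apen_mean = float(np.mean([approximate_entropy(imf) for imf in imfs[:4]]))
    return np.array([lzc, wavelet_energy, coint_stat, apen_mean])
```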
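A sketch of the classification and fusion stage follows, under two explicit substitutions: scikit-learn's SVC (a wrapper around LIBSVM) stands in for the LIBSVM package, and, since the abstract does not give the Takagi–Sugeno fuzzy rules, a plain weighted average of per-channel class probabilities stands in for the fuzzy-model fusion.

```python
# Sketch of per-channel SVM classification plus a simplified fusion step.
import numpy as np
from sklearn.svm import SVC  # scikit-learn's wrapper around LIBSVM

def train_channel_classifiers(features, labels):
    """Fit one RBF-kernel SVM per channel.

    features: (n_trials, n_channels, 4) array of the four features above
    labels:   (n_trials,) binary Arousal or Valence labels
    """
    models = []
    for ch in range(features.shape[1]):
        clf = SVC(kernel='rbf', probability=True)
        clf.fit(features[:, ch, :], labels)
        models.append(clf)
    return models

def fused_prediction(models, features, channel_weights):
    """Weighted average of per-channel class-1 probabilities
    (a stand-in for the paper's Takagi-Sugeno fuzzy fusion)."""
    probs = np.array([m.predict_proba(features[:, ch, :])[:, 1]
                      for ch, m in enumerate(models)])  # (n_channels, n_trials)
    w = np.asarray(channel_weights, dtype=float)
    fused = (w[:, None] * probs).sum(axis=0) / w.sum()
    return (fused >= 0.5).astype(int)
```

In practice the channel weights could be set from per-channel validation accuracy, which is one common way to approximate a consequent-weighted fuzzy combination.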
ISSN: 0263-2241, 1873-412X
DOI: 10.1016/j.measurement.2020.108047