Emotion Recognition From Multi-Channel EEG Signals by Exploiting the Deep Belief-Conditional Random Field Framework


Detailed Description

Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, pp. 33002-33012
Main Authors: Chao, Hao; Liu, Yongli
Format: Article
Language: English
Online Access: Full Text
Description
Summary: With the rapid development of machine learning methods, automatic emotion recognition based on multi-channel electroencephalogram (EEG) signals has recently attracted much attention. However, traditional methods ignore the correlation information between different channels and cannot fully capture the long-term dependencies and contextual information of EEG signals. To address these problems, this paper proposes a deep belief-conditional random field (DBN-CRF) framework that integrates improved deep belief networks with glia chains (DBN-GC) and a conditional random field. In the framework, a raw feature vector sequence is first extracted from the multi-channel EEG signals by a sliding window. Then, parallel DBN-GC models are used to obtain a high-level feature sequence of the multi-channel EEG signals, and the conditional random field (CRF) model generates the predicted emotion label sequence from the high-level feature sequence. Finally, a decision merge layer based on the K-nearest neighbor algorithm is employed to estimate the emotion state. To the best of our knowledge, this is the first attempt to apply the conditional random field methodology to deep belief networks for emotion recognition. Experiments are conducted on three publicly available emotional datasets: AMIGOS, SEED and DEAP. The results demonstrate that the proposed framework can mine inter-channel correlation information through the glia chains and capture the contextual information of EEG signals for emotion recognition. In addition, the classification accuracy of the proposed method is compared with that of several classical techniques; the results indicate that the proposed method outperforms most of the other deep classifiers, demonstrating the potential of the proposed framework.
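The abstract describes a pipeline of stages: sliding-window segmentation of the multi-channel EEG, per-window feature extraction, per-window label prediction, and a final decision-merge step over the predicted label sequence. The sketch below illustrates only the data flow of that pipeline on synthetic data; the `window_features` and `merge_labels` functions are hypothetical stand-ins (simple statistics and a majority vote) for the paper's DBN-GC feature extractor, CRF labeler, and K-nearest-neighbor merge layer, none of which are implemented here.

```python
import numpy as np
from collections import Counter

def sliding_windows(eeg, win_len, step):
    """Split a (channels, samples) EEG array into overlapping windows.

    Returns an array of shape (n_windows, channels, win_len),
    mirroring the sliding-window segmentation step in the abstract."""
    n_channels, n_samples = eeg.shape
    starts = range(0, n_samples - win_len + 1, step)
    return np.stack([eeg[:, s:s + win_len] for s in starts])

def window_features(win):
    """Toy per-channel features (mean and variance per channel),
    a placeholder for the DBN-GC high-level feature extractor."""
    return np.concatenate([win.mean(axis=1), win.var(axis=1)])

def merge_labels(label_seq):
    """Decision-merge placeholder: majority vote over the per-window
    predicted label sequence (the paper uses a KNN-based merge layer)."""
    return Counter(label_seq).most_common(1)[0][0]

# Demo on synthetic 32-channel EEG (1024 samples per channel).
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 1024))

wins = sliding_windows(eeg, win_len=256, step=128)
feats = np.stack([window_features(w) for w in wins])
print(wins.shape)   # (7, 32, 256): 7 windows of 32 channels
print(feats.shape)  # (7, 64): mean + variance for each of 32 channels

# A hypothetical per-window label sequence, merged to one emotion state.
print(merge_labels([1, 0, 1, 1, 0, 1, 1]))  # 1
```

With 1024 samples, a 256-sample window, and a 128-sample step, seven overlapping windows result; in the actual framework each window's feature vector would pass through the parallel DBN-GC models and the CRF before the merge step.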
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.2974009