A Hierarchical Bidirectional GRU Model With Attention for EEG-Based Emotion Classification

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 118530-118540
Authors: Chen, J. X., Jiang, D. M., Zhang, Y. N.
Format: Article
Language: English
Online access: Full text
Description
Abstract: In this paper, we propose a hierarchical bidirectional Gated Recurrent Unit (GRU) network with attention for human emotion classification from continuous electroencephalogram (EEG) signals. The structure of the model mirrors the hierarchical structure of EEG signals, and the attention mechanism is applied at two levels: EEG samples and epochs. By paying different levels of attention to content of different importance, the model can learn a more salient feature representation of the EEG sequence, one that highlights the contribution of important samples and epochs to its emotional category. We conduct cross-subject emotion classification experiments on the DEAP dataset to evaluate the model's performance. The experimental results show that, in the valence and arousal dimensions, our model on 1-s segmented EEG sequences outperforms the best deep baseline (an LSTM model) by 4.2% and 4.6%, and the best shallow baseline by 11.7% and 12%, respectively. Moreover, as the epoch length of the EEG sequences increases, our model shows more robust classification performance than the baseline models, which demonstrates that the proposed model can effectively reduce the impact of the long-term non-stationarity of EEG sequences and improve the accuracy and robustness of EEG-based emotion classification.
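The record itself contains no code; as a rough illustration of the two-level architecture the abstract describes (a sample-level bidirectional GRU with attention pooled into per-epoch vectors, followed by an epoch-level bidirectional GRU with attention over the whole sequence), here is a minimal PyTorch sketch. All class names, layer sizes, and the 32-channel, 128-samples-per-epoch input shape (typical of DEAP's preprocessed EEG at 128 Hz with 1-s epochs) are assumptions for illustration, not the authors' implementation:

```python
import torch
import torch.nn as nn

class Attention(nn.Module):
    """Additive attention pooling: scores each timestep, then returns a weighted sum."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, h):                               # h: (batch, steps, hidden_dim)
        u = torch.tanh(self.proj(h))                    # (batch, steps, hidden_dim)
        alpha = torch.softmax(self.context(u), dim=1)   # (batch, steps, 1) attention weights
        return (alpha * h).sum(dim=1)                   # (batch, hidden_dim) pooled vector

class HierarchicalAttnBiGRU(nn.Module):
    """Sample-level bi-GRU + attention within each epoch, then
    epoch-level bi-GRU + attention over the sequence of epoch vectors.
    Sizes below are illustrative assumptions, not from the paper."""
    def __init__(self, n_channels=32, hidden=64, n_classes=2):
        super().__init__()
        self.sample_gru = nn.GRU(n_channels, hidden, batch_first=True, bidirectional=True)
        self.sample_attn = Attention(2 * hidden)
        self.epoch_gru = nn.GRU(2 * hidden, hidden, batch_first=True, bidirectional=True)
        self.epoch_attn = Attention(2 * hidden)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):
        # x: (batch, n_epochs, samples_per_epoch, n_channels)
        b, e, s, c = x.shape
        x = x.reshape(b * e, s, c)               # fold epochs into the batch dimension
        h, _ = self.sample_gru(x)                # (b*e, s, 2*hidden)
        epoch_vecs = self.sample_attn(h)         # (b*e, 2*hidden): one vector per epoch
        epoch_vecs = epoch_vecs.reshape(b, e, -1)
        h, _ = self.epoch_gru(epoch_vecs)        # (b, e, 2*hidden)
        seq_vec = self.epoch_attn(h)             # (b, 2*hidden): one vector per sequence
        return self.classifier(seq_vec)          # logits, e.g. high/low valence or arousal

# Example: 8 sequences of 10 one-second epochs, 128 samples each, 32 channels
model = HierarchicalAttnBiGRU()
logits = model(torch.randn(8, 10, 128, 32))
print(logits.shape)  # torch.Size([8, 2])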
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2936817