Emotion Recognition on EEG Signal Using ResNeXt Attention 2D-3D Convolution Neural Networks
Saved in:

| Published in: | Neural Processing Letters, 2023-10, Vol. 55 (5), pp. 5943–5957 |
| --- | --- |
| Main authors: | , , , , |
| Format: | Article |
| Language: | English |
| Keywords: | |
| Online access: | Full text |
Abstract:

Emotion recognition based on electroencephalogram (EEG) signals is an important part of human–machine interaction. This paper uses deep learning methods to extract features from EEG data and classify human emotional states. We propose an emotion recognition method that combines two-dimensional and three-dimensional convolutional neural networks, called ResNeXt Attention 2D–3D Convolutional Neural Networks (RA2–3DCNN). The split-transform-merge technique, residual connections, and an attention mechanism are introduced into the shallow network to improve model accuracy, and a 3D CNN is then used to integrate the frequency, spatial, and temporal information of the EEG signal. As input to the model, the pre-processed EEG time series are reconstructed into two-dimensional EEG frames according to the original electrode positions. The classification accuracy of RA2–3DCNN was demonstrated by extensive experiments on the DEAP dataset: the method achieved recognition accuracies of 97.19% on the arousal classification task and 97.58% on the valence classification task. These results confirm the effectiveness of the method's spatio-temporal representation for emotion classification. In addition, we experimentally determined the optimal cardinality for the split-transform-merge technique in the emotion recognition task.
| ISSN: | 1370-4621; 1573-773X |
| --- | --- |
| DOI: | 10.1007/s11063-022-11120-0 |
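
To make the pipeline described in the abstract concrete, below is a minimal PyTorch sketch (not the authors' released code) of the three ideas it names: mapping EEG channels onto 2D frames by electrode position, a ResNeXt-style split-transform-merge block with a residual connection and channel attention, and a 3D convolution that fuses the frames over time. The 9x9 grid layout, the specific electrode coordinates, the cardinality, the layer sizes, and the SE-style attention are all assumptions for illustration; the paper's exact architecture and electrode-to-pixel mapping may differ.

```python
import torch
import torch.nn as nn

# Hypothetical (row, col) placement of a few DEAP electrodes on a 9x9 grid
# following the 10-20 system; the paper's exact mapping is not given here.
ELECTRODE_POS = {
    "Fp1": (0, 3), "Fp2": (0, 5),
    "AF3": (1, 3), "AF4": (1, 5),
    "F7": (2, 0), "F3": (2, 2), "Fz": (2, 4), "F4": (2, 6), "F8": (2, 8),
    # ... remaining DEAP channels omitted for brevity
}

def to_frames(eeg: torch.Tensor, names: list) -> torch.Tensor:
    """Map (n_channels, n_samples) EEG onto per-timestep 9x9 frames."""
    frames = torch.zeros(eeg.shape[1], 9, 9)
    for i, name in enumerate(names):
        r, c = ELECTRODE_POS[name]
        frames[:, r, c] = eeg[i]          # unmapped grid cells stay zero
    return frames

class ResNeXtAttentionBlock(nn.Module):
    """Grouped (split-transform-merge) conv + SE-style attention + residual."""

    def __init__(self, channels: int = 64, cardinality: int = 8):
        super().__init__()
        self.transform = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, groups=cardinality),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.attention = nn.Sequential(   # squeeze-and-excitation style
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // 4, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.transform(x)
        return torch.relu(x + y * self.attention(y))   # residual merge

# A 3D convolution then fuses the spatial frames across time (and, with
# band-decomposed inputs stacked as channels, across frequency as well).
fuse_3d = nn.Conv3d(1, 16, kernel_size=(5, 3, 3), padding=(2, 1, 1))

# Example: one 1-second window sampled at 128 Hz, using only the channels
# mapped above for illustration (DEAP provides 32 EEG channels in full).
eeg = torch.randn(len(ELECTRODE_POS), 128)
frames = to_frames(eeg, list(ELECTRODE_POS))        # (128, 9, 9)
vol = frames.unsqueeze(0).unsqueeze(0)              # (1, 1, 128, 9, 9)
print(fuse_3d(vol).shape)                           # (1, 16, 128, 9, 9)
block = ResNeXtAttentionBlock()
print(block(torch.randn(1, 64, 9, 9)).shape)        # (1, 64, 9, 9)
```

The `groups` argument of `nn.Conv2d` is what realizes the split-transform-merge idea: the input channels are split into `cardinality` groups, each convolved independently, and the results are concatenated; the cardinality whose optimum the abstract reports is exactly this group count.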