EEG emotion recognition based on the attention mechanism and pre-trained convolution capsule network
Published in: Knowledge-Based Systems, 2023-04, Vol. 265, Article 110372
Main authors: , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Given the rapid development of brain–computer interfaces, emotion recognition based on EEG signals has emerged as a research area of considerable importance in recent years. EEG-based emotion recognition remains a challenging pattern-recognition task due to the complexity and diversity of emotional signals, even though deep learning models have significantly outperformed conventional techniques in this area. In this paper, we propose an EEG emotion recognition model based on the attention mechanism and a pre-trained convolution capsule network to recognize various emotions more effectively. The model employs coordinate attention to endow the input signal with relative spatial information and then maps the EEG signal to a higher-dimensional space, which enriches the emotion-related information in the EEG. A pre-trained model, which excels at extracting features, is adopted as the feature extractor. Finally, a double-layer capsule network is constructed for emotion recognition so that the relative location information in the EEG data is fully utilized. Subject-dependent and subject-independent experiments on the DEAP dataset are conducted individually, and the findings demonstrate that the proposed method achieves excellent recognition accuracy and generalizability.
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2023.110372
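
For illustration only, below is a minimal PyTorch sketch of the pipeline the abstract describes: coordinate attention to inject relative spatial information, a stand-in for the pre-trained feature extractor, and capsule layers with dynamic routing. This is not the authors' released code; the module interfaces, tensor shapes, the placeholder 3x64x64 input, the flattening step used in place of the pre-trained backbone, and the routing hyperparameters are all assumptions based on the standard coordinate-attention and capsule-network formulations.

```python
# Hypothetical sketch of the described pipeline (not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class CoordinateAttention(nn.Module):
    """Coordinate attention: pool along each spatial axis separately so the
    resulting gates retain positional (relative spatial) information."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):                                   # x: (b, c, h, w)
        b, c, h, w = x.shape
        x_h = x.mean(dim=3, keepdim=True)                   # (b, c, h, 1)
        x_w = x.mean(dim=2, keepdim=True).transpose(2, 3)   # (b, c, w, 1)
        y = F.relu(self.bn(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                  # (b, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.transpose(2, 3)))  # (b, c, 1, w)
        return x * a_h * a_w                                 # gated input

def squash(s, dim=-1):
    """Capsule non-linearity: scale each vector's length into [0, 1)."""
    n2 = (s * s).sum(dim=dim, keepdim=True)
    return (n2 / (1.0 + n2)) * s / torch.sqrt(n2 + 1e-8)

class CapsuleLayer(nn.Module):
    """One capsule layer with dynamic routing-by-agreement; stacking two of
    these gives a double-layer capsule classifier."""
    def __init__(self, in_caps, in_dim, out_caps, out_dim, iters=3):
        super().__init__()
        self.iters = iters
        self.W = nn.Parameter(
            0.01 * torch.randn(1, in_caps, out_caps, out_dim, in_dim))

    def forward(self, u):                                    # u: (b, in_caps, in_dim)
        # Predict each output capsule from each input capsule.
        u_hat = (self.W @ u[:, :, None, :, None]).squeeze(-1)  # (b, in, out, out_dim)
        logits = torch.zeros(*u_hat.shape[:3], 1, device=u.device)
        for _ in range(self.iters):
            c = logits.softmax(dim=2)                        # coupling coefficients
            v = squash((c * u_hat).sum(dim=1))               # (b, out, out_dim)
            logits = logits + (u_hat * v[:, None]).sum(-1, keepdim=True)
        return v

# Usage sketch on an image-like EEG representation (shapes are placeholders):
x = torch.randn(4, 3, 64, 64)
x = CoordinateAttention(3)(x)
feats = x.flatten(2).transpose(1, 2)   # stand-in for a pre-trained CNN extractor
caps1 = CapsuleLayer(in_caps=feats.shape[1], in_dim=3, out_caps=16, out_dim=8)
caps2 = CapsuleLayer(in_caps=16, in_dim=8, out_caps=2, out_dim=16)  # 2 classes
out = caps2(caps1(feats))
pred = out.norm(dim=-1).argmax(dim=1)  # class = longest output capsule vector
```

Stacking the two `CapsuleLayer` instances mirrors the double-layer capsule classifier the abstract mentions; as in standard capsule networks, the predicted class is the output capsule with the greatest vector length.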