Self-Weighted Semi-Supervised Classification for Joint EEG-Based Emotion Recognition and Affective Activation Patterns Mining

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, 2021, Vol. 70, p. 1-11
Main Authors: Peng, Yong; Kong, Wanzeng; Qin, Feiwei; Nie, Feiping; Fang, Jinglong; Lu, Bao-Liang; Cichocki, Andrzej
Format: Article
Language: English
Description
Abstract: In electroencephalography (EEG)-based affective brain-computer interfaces (aBCIs), there is a consensus that EEG features extracted from different frequency bands and channels differ in their ability to express emotion. Moreover, EEG signals are so weak and non-stationary that they easily cause distribution discrepancies between data collected at different times; therefore, it is necessary to explore the affective activation patterns in cross-session emotion recognition. To address these two problems, we propose a self-weighted semi-supervised classification (SWSC) model in this article for joint EEG-based cross-session emotion recognition and affective activation pattern mining, whose merits include: 1) using both the labeled and unlabeled samples from different sessions to better capture the data characteristics; 2) introducing a self-weighted variable to learn the importance of EEG features adaptively and quantitatively; and 3) automatically mining the activation patterns, including the critical EEG frequency bands and channels, based on the learned self-weighted variable. Extensive experiments were conducted on the benchmark SEED_IV emotion dataset, and SWSC obtained excellent average accuracies of 77.40%, 79.55%, and 81.52% in three cross-session emotion recognition tasks. Moreover, SWSC identifies that the Gamma frequency band contributes the most and that the EEG channels in the prefrontal, left/right temporal, and (central) parietal lobes are more important for cross-session emotion recognition.
ISSN: 0018-9456; 1557-9662
DOI: 10.1109/TIM.2021.3124056
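
To make the high-level idea in the abstract more concrete, the sketch below illustrates one possible reading of a self-weighted semi-supervised classifier. It is an assumption-laden illustration, not the authors' exact SWSC formulation: it alternates a closed-form ridge-regression classifier trained on labeled plus pseudo-labeled samples with an update of per-feature weights derived from the learned projection, so that more discriminative features (e.g., particular band/channel combinations) receive larger weights. All names (swsc_sketch, theta, F_unlab, etc.) are illustrative, not taken from the paper.

import numpy as np

def swsc_sketch(X_lab, y_lab, X_unlab, n_classes, n_iter=10, lam=1.0, eps=1e-8):
    # X_lab: (n_l, d) labeled EEG features; y_lab: (n_l,) integer labels in [0, n_classes)
    # X_unlab: (n_u, d) unlabeled features, e.g., from another recording session
    # Returns predicted labels for X_unlab and the learned per-feature weights.
    d = X_lab.shape[1]
    Y_lab = np.eye(n_classes)[y_lab]                 # one-hot encoding of the labels
    theta = np.ones(d) / d                           # self-weights over features
    F_unlab = np.full((X_unlab.shape[0], n_classes), 1.0 / n_classes)  # soft labels

    for _ in range(n_iter):
        # Stack labeled (hard) and unlabeled (soft) targets; apply feature weights.
        X = np.vstack([X_lab, X_unlab]) * np.sqrt(theta)
        Y = np.vstack([Y_lab, F_unlab])
        # Closed-form ridge regression: W = (X^T X + lam*I)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)
        # Refresh the soft labels of the unlabeled samples from the current projection.
        scores = (X_unlab * np.sqrt(theta)) @ W
        scores -= scores.max(axis=1, keepdims=True)
        F_unlab = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
        # Update the feature weights from the row norms of W; features with larger
        # projection norms are treated as more informative and get larger weights.
        w_norm = np.sqrt((W ** 2).sum(axis=1)) + eps
        theta = w_norm / w_norm.sum()

    return F_unlab.argmax(axis=1), theta

In this reading, theta plays the role of the paper's self-weighted variable: after training, its largest entries point to the band/channel features that drive cross-session recognition, which is how the activation-pattern mining described in the abstract could be inspected.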