Transferrable Subject-Independent Feature Representation for Discriminating EEG-Based Brain Signals
Published in: Guidance, Navigation and Control, 2024-08, Vol. 4 (3)
Main authors: , ,
Format: Article
Language: English
Summary: Subject-independent electroencephalography (EEG) recognition remains challenging due to the inherent variability of brain anatomy across subjects. This variability is further complicated by the Volume Conduction Effect (VCE), which introduces channel-interference noise and exacerbates subject-specific biases in the recorded EEG signals. Existing studies, which often rely on large datasets and entangled spatial-temporal features, struggle to overcome this bias, particularly in scenarios with limited EEG data. To this end, we propose a Temporal-Connective EEG Representation Learning (TCERL) framework that disentangles temporal and spatial feature learning. TCERL first employs a one-dimensional convolutional network to extract channel-specific temporal features, mitigating the channel-interference noise caused by VCE. Building on these temporal features, TCERL then leverages graph neural networks to extract subject-invariant topological features from a functional brain network, constructed using the channel-specific features as nodes and functional connectivity as the adjacency matrix. This approach allows TCERL to capture robust representations of brain activity patterns that generalize well across subjects. Our empirical experiments demonstrate that TCERL outperforms the state of the art across a range of training-subject counts on four public benchmarks and is less sensitive to subject variability. The performance gain is most pronounced when only a few subjects are available, highlighting the robustness and transferability of the proposed method. Source code is available at:
https://github.com/haoweilou/TCERL
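
The pipeline described in the abstract (per-channel temporal encoding followed by graph-based aggregation over a functional-connectivity network) can be illustrated with a minimal PyTorch sketch. This is a hypothetical illustration, not the authors' implementation (see the GitHub repository above for that): the layer sizes, kernel width, channel pooling, and the use of absolute Pearson correlation as the functional-connectivity measure are assumptions made here for concreteness.

```python
# Hypothetical sketch of a TCERL-style pipeline; hyperparameters and the
# connectivity measure are assumptions, not the published implementation.
import torch
import torch.nn as nn

class ChannelTemporalEncoder(nn.Module):
    """Per-channel 1D convolutions: each EEG channel is encoded independently,
    so no spatial mixing (and hence no channel interference) occurs here."""
    def __init__(self, hidden_dim: int = 32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, hidden_dim, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),      # summarize the time axis per channel
        )

    def forward(self, x):                 # x: (batch, channels, time)
        b, c, t = x.shape
        x = x.reshape(b * c, 1, t)        # treat every channel as its own sample
        h = self.conv(x).squeeze(-1)      # (batch * channels, hidden_dim)
        return h.reshape(b, c, -1)        # (batch, channels, hidden_dim)

def functional_connectivity(x):
    """Assumed connectivity measure: absolute Pearson correlation between
    channel time series, used as a dense adjacency matrix."""
    x = x - x.mean(dim=-1, keepdim=True)
    x = x / (x.norm(dim=-1, keepdim=True) + 1e-8)
    return torch.abs(x @ x.transpose(1, 2))          # (batch, channels, channels)

class SimpleGraphLayer(nn.Module):
    """One message-passing step: aggregate neighbor features weighted by the
    row-normalized connectivity matrix, then apply a linear transform."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, nodes, adj):        # nodes: (b, c, d), adj: (b, c, c)
        adj = adj / (adj.sum(dim=-1, keepdim=True) + 1e-8)
        return torch.relu(self.lin(adj @ nodes))

class TCERLSketch(nn.Module):
    def __init__(self, hidden_dim: int = 32, n_classes: int = 4):
        super().__init__()
        self.temporal = ChannelTemporalEncoder(hidden_dim)
        self.graph = SimpleGraphLayer(hidden_dim, hidden_dim)
        self.head = nn.Linear(hidden_dim, n_classes)

    def forward(self, eeg):                           # eeg: (batch, channels, time)
        nodes = self.temporal(eeg)                    # channel-specific temporal features
        adj = functional_connectivity(eeg)            # functional brain network edges
        graph_feat = self.graph(nodes, adj).mean(dim=1)   # pool over channels
        return self.head(graph_feat)

# Usage on a dummy batch: 8 trials, 22 channels, 2-second windows at 250 Hz.
logits = TCERLSketch()(torch.randn(8, 22, 500))
print(logits.shape)                                   # torch.Size([8, 4])
```

Keeping the temporal encoder strictly per-channel is what separates the two learning stages: spatial structure enters only afterwards, through the connectivity-weighted graph aggregation.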
ISSN: 2737-4807; 2737-4920
DOI: 10.1142/S273748072441005X