LEARNING REPRESENTATIONS OF EEG SIGNALS WITH SELF-SUPERVISED LEARNING
Saved in:
Main authors: , ,
Format: Patent
Language: English; French; German
Subjects:
Online access: Order full text
Summary: Self-supervised learning (SSL) is used to leverage structure in unlabeled data to learn representations of EEG signals. Two pretext tasks, one based on temporal context prediction and one on contrastive predictive coding, are applied to two clinically relevant problems: EEG-based sleep staging and pathology detection. Experiments are performed on two large public datasets with thousands of recordings, and the learned representations are compared against purely supervised and hand-engineered baselines.
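The temporal context prediction task mentioned in the summary can be illustrated with a minimal sketch of a "relative positioning" style sampler: two windows are drawn from the same unlabeled recording, and the pretext label says whether their onsets fall within some temporal threshold. The window length, the threshold `tau_pos`, and the sampling scheme below are illustrative assumptions, not parameters taken from this record.

```python
import numpy as np

def sample_relative_positioning_pair(eeg, win, tau_pos, rng):
    """Sample one pretext-task example: two windows from the same
    recording, labeled 1 if their onsets lie within tau_pos samples
    of each other, else 0. (Illustrative sketch, not the patented method.)"""
    n = eeg.shape[-1]
    t1 = int(rng.integers(0, n - win))
    t2 = int(rng.integers(0, n - win))
    x1 = eeg[..., t1:t1 + win]   # first window, all channels
    x2 = eeg[..., t2:t2 + win]   # second window, all channels
    y = 1 if abs(t1 - t2) <= tau_pos else 0  # "temporally close" label
    return x1, x2, y

# Synthetic 2-channel signal standing in for an unlabeled EEG recording.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((2, 3000))
x1, x2, y = sample_relative_positioning_pair(eeg, win=300, tau_pos=600, rng=rng)
```

A feature extractor trained to predict `y` from `(x1, x2)` never needs sleep-stage or pathology labels, which is the point of the SSL setup described above.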