LEARNING REPRESENTATIONS OF EEG SIGNALS WITH SELF-SUPERVISED LEARNING

Self-supervised learning (SSL) is used to leverage structure in unlabeled data to learn representations of EEG signals. Two tasks, based on temporal context prediction and contrastive predictive coding, are applied to two clinically relevant problems: EEG-based sleep staging and pathology detection. Experiments are performed on two large public datasets comprising thousands of recordings, and baseline comparisons are made against purely supervised and hand-engineered approaches.
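As a rough illustration of one temporal context prediction pretext task of the kind the abstract describes (often called relative positioning), the sketch below samples pairs of EEG windows and labels them positive when their start times lie close together and negative when they lie far apart. All names and parameters (`sample_rp_pairs`, `tau_pos`, `tau_neg`) are illustrative assumptions, not taken from the patent itself.

```python
import numpy as np

def sample_rp_pairs(eeg, win, tau_pos, tau_neg, n_pairs, rng):
    """Sample (anchor, other, label) window pairs for a relative-positioning
    pretext task: label 1 if the two windows start within tau_pos samples of
    each other, 0 if they start more than tau_neg samples apart.

    eeg: array of shape (n_channels, n_times); win: window length in samples.
    """
    n_channels, n_times = eeg.shape
    max_start = n_times - win
    anchors, others, labels = [], [], []
    for _ in range(n_pairs):
        a = rng.integers(0, max_start + 1)
        if rng.random() < 0.5:
            # positive pair: second window starts within tau_pos of the anchor
            lo, hi = max(0, a - tau_pos), min(max_start, a + tau_pos)
            b = rng.integers(lo, hi + 1)
            y = 1
        else:
            # negative pair: rejection-sample a start farther than tau_neg away
            while True:
                b = rng.integers(0, max_start + 1)
                if abs(b - a) > tau_neg:
                    break
            y = 0
        anchors.append(eeg[:, a:a + win])
        others.append(eeg[:, b:b + win])
        labels.append(y)
    return np.stack(anchors), np.stack(others), np.array(labels)
```

A feature extractor can then be trained to classify these binary labels from the window pairs, so that it learns temporally informative EEG representations without any sleep-stage or pathology annotations.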

Bibliographic Details
Authors: AIMONE, Christopher Allen; WOOD, Sean Ulrich Niethe; JACOB BANVILLE, Hubert
Format: Patent
Language: English