Time-Series Generative Adversarial Network Approach of Deep Learning Improves Seizure Detection From the Human Thalamic SEEG
Published in: Frontiers in Neurology, 2022-02, Vol. 13, p. 755094
Main authors: , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Seizure detection algorithms are often optimized to detect seizures arising from the epileptogenic cortex. In non-localizable epilepsies, however, the thalamus is frequently targeted for neuromodulation, and a reliable seizure detection algorithm operating on thalamic SEEG could facilitate the translation of closed-loop neuromodulation. Deep learning promises reliable seizure detectors, but the major impediment is the lack of large, curated samples of ictal thalamic SEEG needed to train classifiers. We investigated whether synthetic data generated by a temporal Generative Adversarial Network (TGAN) can inflate the sample size and thereby improve the performance of a deep learning classifier of ictal and interictal states trained on limited thalamic SEEG. Thalamic SEEG from 13 patients (84 seizures) was obtained during stereo-EEG evaluation for epilepsy surgery. Overall, TGAN-generated synthetic data improved the performance of a bidirectional Long Short-Term Memory (BiLSTM) network in classifying thalamic ictal and baseline states, increasing the accuracy of the detection model by 18.5%. Importantly, this approach can also be applied to classify electrographic seizure-onset patterns or to develop patient-specific seizure detectors for implanted neuromodulation devices.
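To make the augment-then-classify pipeline in the abstract concrete, the sketch below trains a BiLSTM to label SEEG windows as ictal or baseline, optionally mixing in synthetic ictal windows drawn from a pre-trained generator. This is a minimal illustration under stated assumptions, not the authors' code: the channel count, window length, training hyperparameters, and the `generator` interface standing in for the trained TGAN are all hypothetical; the paper's actual architecture and training details are in the full text.

```python
# Minimal sketch (PyTorch): augment scarce ictal SEEG with GAN-generated
# sequences, then train a BiLSTM classifier of ictal vs. baseline states.
# Shapes and helper names are assumptions for illustration only.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

N_CHANNELS = 1      # thalamic SEEG channels per window (assumed)
WINDOW_LEN = 500    # samples per window, e.g. 1 s at 500 Hz (assumed)

class BiLSTMClassifier(nn.Module):
    """Bidirectional LSTM over an SEEG window -> ictal/baseline logits."""
    def __init__(self, n_channels=N_CHANNELS, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden,
                            num_layers=2, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, 2)  # two classes

    def forward(self, x):                 # x: (batch, time, channels)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # logits from the last time step

def augment_with_synthetic(real_x, real_y, generator, n_synth):
    """Concatenate GAN-generated ictal windows onto the real training set.
    `generator` stands in for a trained temporal GAN (hypothetical API)
    that maps noise to windows shaped (n_synth, WINDOW_LEN, N_CHANNELS)."""
    with torch.no_grad():
        z = torch.randn(n_synth, WINDOW_LEN, N_CHANNELS)
        synth_x = generator(z)
    synth_y = torch.ones(n_synth, dtype=torch.long)  # label 1 = ictal
    return torch.cat([real_x, synth_x]), torch.cat([real_y, synth_y])

def train(model, x, y, epochs=10):
    """Standard supervised training loop over labeled SEEG windows."""
    loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            loss_fn(model(xb), yb).backward()
            opt.step()
```

In this sketch, the accuracy gain the abstract reports would correspond to comparing a model trained on the real windows alone against one trained on the output of `augment_with_synthetic`; the generator itself must first be fit to the real ictal windows, as the TGAN is in the paper.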
ISSN: 1664-2295
DOI: 10.3389/fneur.2022.755094