A Convolutional Spiking Network for Gesture Recognition in Brain-Computer Interfaces
Format: Article
Language: English
Abstract: Brain-computer interfaces are being explored for a wide variety of
therapeutic applications. Typically, this involves measuring and analyzing
continuous-time electrical brain activity via techniques such as
electrocorticogram (ECoG) or electroencephalography (EEG) to drive external
devices. However, due to the inherent noise and variability in the
measurements, the analysis of these signals is challenging and requires offline
processing with significant computational resources. In this paper, we propose
a simple yet efficient machine learning-based approach for the exemplary
problem of hand gesture classification based on brain signals. We use a hybrid
machine learning approach that uses a convolutional spiking neural network
employing a bio-inspired event-driven synaptic plasticity rule for unsupervised
feature learning of the measured analog signals encoded in the spike domain. We
demonstrate that this approach generalizes to different subjects with both EEG
and ECoG data and achieves superior accuracy in the range of 92.74-97.07% in
identifying different hand gesture classes and motor imagery tasks.
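The pipeline described in the abstract rests on two ingredients: encoding the analog EEG/ECoG signals into spike trains, and adapting feature weights with an event-driven (STDP-like) plasticity rule. The NumPy sketch below illustrates both at a toy scale; the delta-threshold encoder, the pairwise STDP rule, and all parameter values (`threshold`, `a_plus`, `a_minus`, `tau`) are illustrative assumptions and not the specific method used in the paper.

```python
# Minimal sketch (NumPy only) of spike-domain encoding plus an STDP-like update.
# Illustrative assumptions throughout; not the paper's exact encoder or rule.
import numpy as np

rng = np.random.default_rng(0)

def delta_spike_encode(signal, threshold=0.05):
    """Encode a 1-D analog signal as ON/OFF spike trains: emit a spike whenever
    the signal has moved by more than `threshold` since the last spike."""
    on = np.zeros(len(signal), dtype=np.int8)
    off = np.zeros(len(signal), dtype=np.int8)
    ref = signal[0]
    for t in range(1, len(signal)):
        if signal[t] - ref >= threshold:
            on[t] = 1
            ref = signal[t]
        elif ref - signal[t] >= threshold:
            off[t] = 1
            ref = signal[t]
    return on, off

def stdp_update(weight, pre_spikes, post_spikes,
                a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pairwise STDP-like rule: potentiate when a pre-synaptic spike precedes a
    post-synaptic spike, depress when it follows (illustrative parameters)."""
    pre_times = np.flatnonzero(pre_spikes)
    post_times = np.flatnonzero(post_spikes)
    dw = 0.0
    for t_post in post_times:
        for t_pre in pre_times:
            dt = t_post - t_pre
            if dt > 0:
                dw += a_plus * np.exp(-dt / tau)
            elif dt < 0:
                dw -= a_minus * np.exp(dt / tau)
    return np.clip(weight + dw, 0.0, 1.0)

# Toy usage: encode a noisy sine wave standing in for one EEG/ECoG channel,
# then nudge a single synaptic weight with placeholder post-synaptic spikes.
signal = np.sin(np.linspace(0, 4 * np.pi, 400)) + 0.05 * rng.standard_normal(400)
on, off = delta_spike_encode(signal)
post = (rng.random(400) < 0.05).astype(np.int8)
w = stdp_update(0.5, on, post)
print(f"ON spikes: {on.sum()}, OFF spikes: {off.sum()}, updated weight: {w:.3f}")
```

In a convolutional spiking network of the kind the abstract names, the same kind of event-driven update would be applied to shared convolutional filters over many channels and time windows rather than to a single scalar weight as in this toy example.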
DOI: 10.48550/arxiv.2304.11106