Analysis of motor imagery data from EEG device to move prosthetic hands by using deep learning classification

Bibliographic Details
Main authors: Saragih, Agung Shamsuddin; Basyiri, Hadyan Nasran; Raihan, Muhammad Yusuf
Format: Conference proceedings
Language: English
Online access: Full text
Description
Abstract: Controlling an artificial hand with the mind is a dream for many people who have lost their limbs. Brain-Computer Interface (BCI) technology is expected to make this possible by conveying commands and responses to and from the brain as information in a control system. However, the complexity of the EEG signal makes this difficult to realize. A deep learning-based classification model is expected to offer a solution by classifying the hand movements imagined by the user as input to the control system of an electric artificial hand. The main aim of this study is to classify EEG signals from the human brain in real time, using a non-invasive EEG headset, for two different hand operations: rest and grip. The OpenBCI Ultracortex Mark IV headset was used in this study. The study proposes a solution for classifying rest and grip hand movements by exploiting a Long Short-Term Memory (LSTM) network and a Convolutional Neural Network (CNN) to learn from electroencephalogram (EEG) time-series information. EEG signals were recorded from one healthy subject at specific locations on the scalp: points F3, Fz, F4, FC1, FC2, C3, Cz, and C4. A wide range of time-domain features was extracted from the EEG signals and used to train an LSTM and a CNN to perform the classification task. The headset captures brain waves along with artefacts such as limb movement, heartbeat, blinks, and more. Raw EEG from the headset was processed for event detection: it was filtered with a Butterworth bandpass filter to separate the signal data into new datasets containing the alpha range, the beta range, and both ranges together. The results indicate that the CNN-based classification model for the two types of hand movement achieves an accuracy of up to 95.45%, while the LSTM-based model achieves up to 93.64%. Detected events were then used to trigger control signals to a prosthetic hand controlled by a microcontroller.
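
The band-separation step described above lends itself to a short sketch. The following Python snippet is a minimal illustration, not the authors' code: a zero-phase Butterworth bandpass filter splits raw EEG into alpha, beta, and combined datasets. The 250 Hz sampling rate matches the OpenBCI Cyton board typically used with the Ultracortex Mark IV, while the filter order and exact band edges are assumptions.

    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 250.0  # Hz, OpenBCI Cyton sampling rate (assumed)

    def bandpass(data, low_hz, high_hz, fs=FS, order=4):
        """Zero-phase Butterworth bandpass over the last axis (time samples)."""
        nyq = fs / 2.0
        b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
        return filtfilt(b, a, data, axis=-1)

    # raw: (channels, samples) array from the 8 scalp electrodes
    raw = np.random.randn(8, 4 * int(FS))  # stand-in for a 4 s recording window

    alpha = bandpass(raw, 8.0, 13.0)   # alpha-band dataset
    beta = bandpass(raw, 13.0, 30.0)   # beta-band dataset
    both = bandpass(raw, 8.0, 30.0)    # combined alpha+beta dataset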
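The abstract names the two classifiers but not their architectures, so the sketch below only illustrates the general shape of a 1D CNN and an LSTM for binary rest-versus-grip classification over multichannel EEG windows. All layer sizes and the window length are illustrative assumptions.

    import tensorflow as tf

    N_SAMPLES, N_CHANNELS = 1000, 8  # 4 s window at 250 Hz over 8 electrodes (assumed)

    def build_cnn():
        """Small 1D CNN over (time, channels) EEG windows."""
        return tf.keras.Sequential([
            tf.keras.Input(shape=(N_SAMPLES, N_CHANNELS)),
            tf.keras.layers.Conv1D(32, kernel_size=7, activation="relu"),
            tf.keras.layers.MaxPooling1D(4),
            tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
            tf.keras.layers.GlobalAveragePooling1D(),
            tf.keras.layers.Dense(1, activation="sigmoid"),  # P(grip)
        ])

    def build_lstm():
        """Single-layer LSTM over the same windows."""
        return tf.keras.Sequential([
            tf.keras.Input(shape=(N_SAMPLES, N_CHANNELS)),
            tf.keras.layers.LSTM(64),
            tf.keras.layers.Dense(1, activation="sigmoid"),  # P(grip)
        ])

    model = build_cnn()
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(X_train, y_train, ...) with X_train shaped (n_windows, N_SAMPLES, N_CHANNELS)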
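Finally, detected events trigger the microcontroller driving the prosthetic hand. A plausible minimal sketch follows, assuming a serial link and a one-byte command protocol ('G' = grip, 'R' = rest); neither the port, baud rate, nor protocol is specified in the abstract.

    import serial  # pyserial

    def send_command(port: serial.Serial, p_grip: float, threshold: float = 0.5):
        """Map the classifier's grip probability to a single-byte command."""
        port.write(b"G" if p_grip >= threshold else b"R")

    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:  # port name assumed
        send_command(port, p_grip=0.97)  # e.g. from model.predict(window)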
ISSN: 0094-243X, 1551-7616
DOI: 10.1063/5.0098178