EEG signal classification method based on fractal features and neural network

Bibliographic details
Published in: 2008 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2008-01, Vol. 2008, pp. 3880-3883
Main authors: Phothisonothai, Montri; Nakagawa, Masahiro
Format: Article
Language: English
Description
Abstract: In this paper, we propose a method to classify electroencephalogram (EEG) signals recorded during left- and right-hand movement imagination. Three subjects (two males and one female) volunteered to participate in the experiment. We use a complexity measure based on fractal analysis to reveal feature patterns in the EEG signal. An effective algorithm, detrended fluctuation analysis (DFA), is selected to estimate the embedded fractal dimension (FD) values of the relaxing and imagery states of the recorded EEG signal. To obtain the waveform of the FDs, we use a windowing-based method, called the time-dependent fractal dimension (TDFD), together with the Kullback-Leibler (K-L) divergence. Two feature parameters, the K-L divergence and the difference of expected values, are proposed as the input variables of the classifier. Finally, the feature data are classified by a three-layer feed-forward neural network trained with a simple backpropagation algorithm. The experimental results are applicable to brain-computer interface (BCI) applications and show that the proposed method is more effective than the conventional method, with improved average classification rates of 87.5% and 88.3% for the left- and right-hand movement imagery tasks, respectively.
ISSN: 1094-687X, 1557-170X, 1558-4615
DOI: 10.1109/IEMBS.2008.4650057
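
For illustration only, here is a minimal sketch (Python/NumPy) of the feature-extraction steps summarized in the abstract: DFA is applied over a sliding window to obtain the time-dependent fractal dimension, and the K-L divergence plus the difference of expected values between the relaxing-state and imagery-state FD sequences give the two classifier inputs. The window length, step size, DFA box sizes, histogram binning, assumed sampling rate, and the FD = 3 - alpha conversion (a common relation for fBm-like signals) are choices made for this example, not details taken from the paper.

```python
import numpy as np

def dfa_alpha(x, box_sizes):
    """Detrended fluctuation analysis: return the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))                      # integrated profile
    fluctuations = []
    for n in box_sizes:
        n_boxes = len(y) // n
        t = np.arange(n)
        sq_dev = []
        for b in range(n_boxes):
            seg = y[b * n:(b + 1) * n]
            coeffs = np.polyfit(t, seg, 1)             # local linear trend
            sq_dev.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(sq_dev)))  # F(n)
    # slope of log F(n) versus log n
    alpha, _ = np.polyfit(np.log(box_sizes), np.log(fluctuations), 1)
    return alpha

def tdfd(signal, win, step, box_sizes):
    """Time-dependent fractal dimension via a sliding window.
    FD = 3 - alpha is a common conversion for fBm-like signals (assumption)."""
    return np.array([3.0 - dfa_alpha(signal[s:s + win], box_sizes)
                     for s in range(0, len(signal) - win + 1, step)])

def kl_divergence(p_samples, q_samples, bins=20):
    """K-L divergence between histograms of two FD sequences."""
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi), density=True)
    p, q = p + 1e-12, q + 1e-12                        # avoid log(0)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Features for one trial, with placeholder noise standing in for real EEG
fs = 256                                               # assumed sampling rate
rng = np.random.default_rng(0)
relax = rng.standard_normal(5 * fs)                    # relaxing-state segment
imagery = rng.standard_normal(5 * fs)                  # imagery-state segment
boxes = [16, 32, 64, 128]
fd_relax = tdfd(relax, win=2 * fs, step=fs // 4, box_sizes=boxes)
fd_imag = tdfd(imagery, win=2 * fs, step=fs // 4, box_sizes=boxes)
features = np.array([kl_divergence(fd_imag, fd_relax),    # K-L divergence
                     fd_imag.mean() - fd_relax.mean()])   # difference of means
```

The two features per trial would then be fed to the classifier. A correspondingly minimal sketch of a three-layer feed-forward network trained with plain backpropagation is given below; the hidden-layer size, learning rate, epoch count, and the omission of bias terms are simplifications for this example, not the paper's settings.

```python
import numpy as np

def train_nn(X, y, n_hidden=6, lr=0.1, epochs=2000, seed=0):
    """Three-layer (input-hidden-output) network with sigmoid units,
    trained by gradient-descent backpropagation on a squared-error loss."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    for _ in range(epochs):
        h = 1.0 / (1.0 + np.exp(-(X @ W1)))            # hidden activations
        out = 1.0 / (1.0 + np.exp(-(h @ W2)))          # network output
        d_out = (out - y) * out * (1.0 - out)          # output-layer error term
        d_hid = (d_out @ W2.T) * h * (1.0 - h)         # hidden-layer error term
        W2 -= lr * (h.T @ d_out)
        W1 -= lr * (X.T @ d_hid)
    return W1, W2

def predict(X, W1, W2):
    """Threshold the network output at 0.5 (e.g. 0 = left, 1 = right)."""
    h = 1.0 / (1.0 + np.exp(-(X @ W1)))
    return (1.0 / (1.0 + np.exp(-(h @ W2))) > 0.5).astype(int)

# Usage would stack one two-element feature row per trial into X with shape
# (n_trials, 2) and the corresponding labels into y with shape (n_trials, 1).
```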