RETRACTED ARTICLE: A novel multi-layer multi-spiking neural network for EEG signal classification using Mini Batch SGD

Bibliographic Details
Published in: Soft Computing (Berlin, Germany), 2023-07, Vol. 27 (14), p. 9877-9890
Main authors: Ramesh, M, Revoori, Swetha, Edla, Damodar Reddy, Kiran, K. V. D
Format: Article
Language: English
Online access: Full text
Abstract: A novel multi-layer multi-spiking neural network (MMSNN) model transmits information from one neuron to the next through multiple synapses carrying different spikes. Without a suitable training method, training spiking neural networks involves numerous complications. Gradient descent algorithms converge slowly when used to train multi-spiking neural networks, and the spiking network is likely to perform poorly in classifying electroencephalographic (EEG) datasets because of its high propensity to fall into local minima. The EEG dataset was created from a Concealed Information Test recorded with a 16-channel electrode setup. Mini Batch Stochastic Gradient Descent (MBSGD) uses a configurable batch size and reduces the number of iterations, which speeds up the convergence rate; it also avoids the problem of settling into local minima more effectively. This paper describes an MBSGD algorithm for training the MMSNN model that addresses the local minima and convergence rate issues, and presents the resulting MBSGD-MMSNN model for classifying EEG data. The MBSGD algorithm adjusts the weight parameters of the MMSNN model over mini-batches of the chosen batch size until the error falls below the required value, yielding a faster convergence rate and avoiding local minima issues. The performance of the proposed MBSGD-MMSNN model is compared with that of existing meta-heuristic algorithms, and it outperforms other state-of-the-art algorithms in terms of accuracy and learning performance.
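
The training procedure summarized in the abstract amounts to: split the data into mini-batches, update the weight parameters on each batch, and stop once the error drops below the required value. The following is a minimal illustrative sketch of such a mini-batch SGD loop in Python. The paper's MMSNN spike-response model is not specified in this record, so a simple logistic classifier stands in for the network's synaptic weights; names such as mini_batch_sgd, batch_size, and error_threshold are illustrative assumptions rather than the paper's notation.

import numpy as np

def mini_batch_sgd(X, y, batch_size=32, lr=0.1, error_threshold=0.05, max_epochs=500):
    """Mini-batch SGD with an error-threshold stopping rule (illustrative sketch)."""
    rng = np.random.default_rng(0)
    n_samples, n_features = X.shape
    w = np.zeros(n_features)   # stand-in for the MMSNN synaptic weight parameters
    b = 0.0
    for epoch in range(max_epochs):
        order = rng.permutation(n_samples)
        for start in range(0, n_samples, batch_size):
            idx = order[start:start + batch_size]
            xb, yb = X[idx], y[idx]
            # Forward pass: a sigmoid output stands in for spike-based firing.
            p = 1.0 / (1.0 + np.exp(-(xb @ w + b)))
            # Gradient of the mean cross-entropy loss over this mini-batch.
            grad_w = xb.T @ (p - yb) / len(idx)
            grad_b = np.mean(p - yb)
            w -= lr * grad_w
            b -= lr * grad_b
        # Stop once the training error is less than the required value.
        pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
        error = np.mean(pred != y)
        if error < error_threshold:
            break
    return w, b, error

# Illustrative usage on synthetic binary data (not the paper's EEG dataset);
# the 16 features loosely echo the 16-channel recording mentioned in the abstract.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 16))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
w, b, err = mini_batch_sgd(X, y)
print(f"final training error: {err:.3f}")
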
ISSN: 1432-7643, 1433-7479
DOI: 10.1007/s00500-023-08404-5