Thoughts of brain EEG signal-to-text conversion using weighted feature fusion-based Multiscale Dilated Adaptive DenseNet with Attention Mechanism


Detailed Description

Bibliographic Details
Published in: Biomedical Signal Processing and Control, 2023-09, Vol. 86, p. 105120, Article 105120
Main authors: Yang, Jing, Awais, Muhammad, Hossain, Md. Amzad, Lip Yee, Por, Haowei, Ma, Mehedi, Ibrahim M., Iskanderani, A.I.M.
Format: Article
Language: English
Online access: Full text
Description
Summary:
• To develop an efficient brain thoughts-to-text conversion model that processes EEG signals using a multi-scale deep learning architecture with an optimization strategy to recognize brain thoughts in text format.
• Perform weighted feature concatenation using EOWGMO to enhance text conversion performance, optimizing feature selection and weights for high accuracy with low computational overhead.
• Integrate MDADenseNet-AM, an adaptive deep learning architecture that adds multiscale dilated convolutions and an attention mechanism to DenseNet, and optimize its parameters with EOWGMO for improved text conversion performance.
• Develop the hybrid heuristic EOWGMO for weight tuning and optimal feature selection in weighted feature fusion, and use it to tune DenseNet training parameters (optimizer, epochs) for improved thoughts-to-text conversion performance.
• Compare the developed thought-to-text conversion model with existing techniques in experimental analysis, highlighting its efficiency.

Individuals with visual impairments or different abilities face difficulties using their hands to operate smartphones and computers, forcing them to rely on others to enter data. Such dependence can raise security and privacy issues, especially when sensitive information is shared with helpers. To address this problem, we present Think2Type, an efficient Brain-Computer Interface (BCI) that enables users to translate their active intentions into text based on Morse code. A BCI leverages brain activity, often captured via electroencephalography (EEG), to facilitate interaction with computers. This work proposes an enhanced attention-based deep learning strategy to build an efficient text conversion mechanism from EEG signals. We begin by collecting EEG signals from standard benchmark datasets and, in phase 1, extracting spectral and statistical features, which are concatenated into feature set 1 (F1).
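The phase-1 step described in the abstract (spectral and statistical features concatenated into F1) can be sketched roughly as follows; the band edges, the particular statistical moments, and the sampling rate are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def phase1_features(eeg, fs=256.0):
    """Build concatenated feature set F1 from raw EEG.

    `eeg` is (channels, samples). Band edges and the exact feature
    set are illustrative assumptions, not the paper's configuration.
    """
    feats = []
    for ch in eeg:
        # Spectral features: band power from an FFT periodogram
        psd = np.abs(np.fft.rfft(ch)) ** 2 / len(ch)
        freqs = np.fft.rfftfreq(len(ch), d=1.0 / fs)
        for lo, hi in [(0.5, 4), (4, 8), (8, 13), (13, 30), (30, 45)]:
            band = (freqs >= lo) & (freqs < hi)
            feats.append(psd[band].sum())
        # Statistical features: time-domain moments
        m, s = ch.mean(), ch.std()
        feats += [m, s,
                  ((ch - m) ** 3).mean() / s ** 3,   # skewness
                  ((ch - m) ** 4).mean() / s ** 4]   # kurtosis
    return np.asarray(feats)  # concatenated feature set F1

rng = np.random.default_rng(0)
F1 = phase1_features(rng.standard_normal((8, 1024)))
print(F1.shape)  # 8 channels x (5 band powers + 4 stats) -> (72,)
```

Per channel this yields five band powers plus four moments; the vectors from all channels are concatenated into a single F1, mirroring the "concatenated feature set" wording of the abstract.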
In phase 2, we extract spatial and temporal features via a One-Dimensional Convolutional Neural Network (1DCNN) and a Recurrent Neural Network (RNN), respectively, concatenating them into feature set 2 (F2). Weighted feature fusion is then performed on F1 and F2, with the hybrid optimization algorithm Eurasian Oystercatcher Wild Geese Migration Optimization (EOWGMO) optimizing the weights for improved fusion efficiency. The text conversion phase utilizes the Multiscale Dilated Adaptive DenseNet with Attention Mechanism (MDADenseNet-AM) to obtain th
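The weighted fusion step can be illustrated with a minimal sketch. Here the fusion weight `w` is a fixed stand-in for the value EOWGMO would optimize, and the feature dimensions are hypothetical:

```python
import numpy as np

def weighted_fusion(F1, F2, w):
    """Weighted concatenation of the two feature sets.

    In the paper the fusion weight is tuned by EOWGMO; here `w`
    is a fixed stand-in chosen for illustration only.
    """
    return np.concatenate([w * F1, (1.0 - w) * F2])

# Hypothetical feature vectors standing in for the extracted sets
F1 = np.ones(72)    # phase-1 spectral + statistical features
F2 = np.ones(128)   # phase-2 1DCNN + RNN features
fused = weighted_fusion(F1, F2, w=0.6)
print(fused.shape)  # -> (200,)
```

The fused vector keeps both feature sets but lets the optimizer trade off their relative influence through a single scalar, which is what makes the weight a natural target for a metaheuristic such as EOWGMO.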
ISSN: 1746-8094, 1746-8108
DOI: 10.1016/j.bspc.2023.105120