NIRS Data Augmentation Technique to Detect Hemodynamic Peaks during Self-Paced Motor Imagery
Saved in:
Published in: | IEEE Access, 2023-01, Vol. 11, p. 1-1 |
---|---|
Authors: | , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | Optical brain monitoring, such as near-infrared spectroscopy (NIRS), has facilitated numerous brain studies, including those based on machine learning techniques. A large and diverse dataset is necessary for training machine learning algorithms to avoid overfitting to a limited amount of data. However, recruiting sufficient subjects is challenging owing to time and budget constraints. Therefore, we propose an NIRS data generation algorithm that scales NIRS signal components, such as the hemodynamic response function, physiological noise, and system spike noise, based on the source-detector distance to augment the training data. The experimental data were augmented with the generated NIRS data to train a convolutional neural network to classify self-paced left- and right-hand motor imagery. Augmenting the training dataset with 1000 generated data points increased the classification accuracy to 86.3 ± 4.1%, a 26% increase compared with training on experimental data only. In addition, we applied Guided Gradient-weighted Class Activation Mapping (Grad-CAM) to visualize the class-discriminative features of the input data. The Guided Grad-CAM heatmaps aligned well with the oxy-hemoglobin peaks during self-paced motor imagery. We concluded that increased cerebral oxygenation, especially in the contralateral hemisphere, was the class-discriminative feature for classifying left- and right-hand motor imagery. |
---|---|
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2023.3263489 |
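
The abstract describes generating synthetic NIRS trials by scaling the hemodynamic response, physiological noise, and spike noise according to source-detector distance. A minimal sketch of that idea is below; the HRF shape (canonical double-gamma), the physiological-noise frequencies, and every scaling coefficient are illustrative assumptions, not the paper's fitted values, and `generate_nirs_trial` is a hypothetical helper name.

```python
import numpy as np
from math import gamma as gamma_fn

def hrf(t, a1=6.0, a2=16.0, ratio=1.0 / 6.0):
    """Double-gamma hemodynamic response function (assumed canonical shape)."""
    g1 = t ** (a1 - 1) * np.exp(-t) / gamma_fn(a1)
    g2 = t ** (a2 - 1) * np.exp(-t) / gamma_fn(a2)
    h = g1 - ratio * g2
    return h / h.max()

def generate_nirs_trial(fs=10.0, duration=20.0, sd_distance_mm=30.0, rng=None):
    """Generate one synthetic NIRS trial as HRF + physiological noise +
    spike noise, each scaled relative to a 30 mm reference source-detector
    distance. All gains here are illustrative assumptions."""
    rng = np.random.default_rng() if rng is None else rng
    t = np.arange(0.0, duration, 1.0 / fs)

    # Assumption: the cortical (HRF) contribution grows with SD distance,
    # since longer channels sample deeper tissue.
    depth_gain = sd_distance_mm / 30.0
    signal = depth_gain * hrf(t)

    # Physiological noise: cardiac (~1 Hz), respiration (~0.25 Hz),
    # Mayer waves (~0.1 Hz), with random phases and assumed amplitudes.
    for freq, amp in [(1.0, 0.2), (0.25, 0.15), (0.1, 0.1)]:
        phase = rng.uniform(0.0, 2.0 * np.pi)
        signal += amp * np.sin(2.0 * np.pi * freq * t + phase)

    # System spike noise: a few random high-amplitude samples.
    n_spikes = rng.integers(0, 3)
    spike_idx = rng.integers(0, t.size, size=n_spikes)
    signal[spike_idx] += rng.normal(0.0, 1.0, size=n_spikes)

    # Baseline measurement noise.
    signal += rng.normal(0.0, 0.05, size=t.size)
    return t, signal
```

Varying `sd_distance_mm` and the noise phases across calls yields a diverse pool of labeled synthetic trials that could augment a small experimental training set, in the spirit of the approach the abstract outlines.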