MDTL: A Novel and Model-Agnostic Transfer Learning Strategy for Cross-Subject Motor Imagery BCI
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2023-01, Vol. PP, p. 1-1
Main authors: , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: In recent years, deep neural network-based transfer learning (TL) has shown outstanding performance in EEG-based motor imagery (MI) brain-computer interfaces (BCI). However, because of the long preparation time for pre-trained models and the arbitrariness of source-domain selection, applying deep transfer learning across different datasets and models remains challenging. In this paper, we propose a multi-direction transfer learning (MDTL) strategy for cross-subject MI EEG-based BCI. The strategy transfers knowledge from multiple source domains to the target domain as well as between source domains. It is model-independent, so it can be quickly deployed on existing models. Three generic deep learning models for MI classification (DeepConvNet, ShallowConvNet, and EEGNet) and two public motor imagery datasets (BCIC IV dataset 2a and Lee2019) are used to verify the proposed strategy. For the four-class BCIC IV dataset 2a, MDTL achieves 80.86%, 81.95%, and 75.00% mean prediction accuracy with the three models, outperforming the same models without MDTL by 5.79%, 6.64%, and 11.42%. For the binary-class Lee2019 dataset, MDTL achieves 88.2% mean accuracy with DeepConvNet, exceeding the accuracy without MDTL by 23.48%. The achieved 81.95% and 88.2% are also better than the existing deep transfer learning strategy. In addition, the training time with MDTL is reduced by 93.94%. MDTL is an easy-to-deploy, scalable, and reliable transfer learning strategy for existing deep learning models that significantly improves performance and reduces preparation time without changing the model architecture.
ISSN: 1534-4320, 1558-0210
DOI: 10.1109/TNSRE.2023.3259730
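The abstract describes MDTL only at a high level: knowledge is passed from multiple source subjects to the target subject and also from one source subject to another, and the strategy wraps around any existing MI model. The sketch below is a minimal, hypothetical PyTorch illustration of that idea, not the authors' implementation: the placeholder network, the sequential source-to-source schedule, the epoch counts, and the helper names (`make_model`, `train_epochs`, `mdtl`) are all assumptions made for illustration.

```python
# Hypothetical sketch of a multi-direction transfer learning (MDTL) loop.
# Assumption: the shared model is refined by each source subject in turn
# (source -> source transfer) and then fine-tuned on the target subject
# (sources -> target). The real MDTL schedule may differ; this only shows
# that the strategy is model-agnostic and wraps any existing classifier.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset


def make_model(n_channels=22, n_classes=4):
    # Placeholder CNN; in practice this would be DeepConvNet,
    # ShallowConvNet, or EEGNet, since MDTL does not change the architecture.
    return nn.Sequential(
        nn.Conv2d(1, 8, kernel_size=(1, 64)),
        nn.BatchNorm2d(8),
        nn.ELU(),
        nn.AdaptiveAvgPool2d((n_channels, 16)),
        nn.Flatten(),
        nn.Linear(8 * n_channels * 16, n_classes),
    )


def train_epochs(model, loader, epochs=1, lr=1e-3, device="cpu"):
    # Plain supervised training used for every transfer step.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.to(device).train()
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model


def mdtl(source_loaders, target_loader, device="cpu"):
    """Sketch: one shared model visits every source subject in sequence
    (source-to-source transfer), then adapts to the target subject."""
    model = make_model()
    for loader in source_loaders:
        train_epochs(model, loader, epochs=2, device=device)
    # Final adaptation with the (typically small) target training set,
    # using a lower learning rate to preserve the transferred features.
    train_epochs(model, target_loader, epochs=3, lr=1e-4, device=device)
    return model


if __name__ == "__main__":
    # Dummy EEG-shaped tensors: (trials, 1, channels, samples), 4 MI classes.
    def dummy_loader(n=32):
        x = torch.randn(n, 1, 22, 1000)
        y = torch.randint(0, 4, (n,))
        return DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

    sources = [dummy_loader() for _ in range(3)]
    target = dummy_loader()
    mdtl(sources, target)
```

Because the transfer steps only call a generic training routine on an unmodified model, swapping in a different backbone (e.g. EEGNet instead of the placeholder CNN) requires changing only `make_model`, which is what the abstract means by the strategy being model-independent.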