Decoding Behavior Tasks From Brain Activity Using Deep Transfer Learning

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, p. 43222-43232
Main Authors: Gao, Yufei, Zhang, Yameng, Wang, Hailing, Guo, Xiaojuan, Zhang, Jiacai
Format: Article
Language: English
Online Access: Full text
Description
Summary: Recently, advances in noninvasive detection techniques have shown that it is possible to decode visual information from measurable brain activities. However, these studies have typically focused on the mapping between neural activities and visual information, such as an image or video stimulus, at the individual level. Here, we investigated common decoding models that classify behavior tasks from brain signals across individuals. We proposed a cross-subject decoding approach using deep transfer learning (DTL) to decipher behavior tasks from functional magnetic resonance imaging (fMRI) recordings acquired while subjects performed different tasks. We connected parts of state-of-the-art networks pre-trained on the ImageNet dataset to newly defined adaptation layers to classify the behavior tasks from fMRI data. Our experiments on the Human Connectome Project (HCP) dataset showed that the proposed method achieved higher cross-subject decoding accuracy than previous studies. We also conducted an experiment on five subsets of the HCP data, which further demonstrated that our DTL approach is more effective on small datasets than traditional methods.
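
To make the transfer-learning setup described in the abstract concrete, the following is a minimal PyTorch sketch of the general idea: the convolutional part of an ImageNet-pre-trained network is frozen and connected to newly defined adaptation layers that output task classes. The backbone choice (ResNet-50), the adaptation-layer sizes, the number of task classes, and the rendering of fMRI data as 3-channel 2D inputs are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of the deep-transfer-learning idea, assuming PyTorch/torchvision.
    import torch
    import torch.nn as nn
    from torchvision import models

    NUM_TASKS = 7  # assumed number of behavior-task classes, for illustration only

    class DTLClassifier(nn.Module):
        def __init__(self, num_tasks: int = NUM_TASKS):
            super().__init__()
            # Backbone pre-trained on ImageNet; keep only its convolutional part.
            backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
            self.features = nn.Sequential(*list(backbone.children())[:-1])
            # Freeze pre-trained weights so only the adaptation layers are trained.
            for p in self.features.parameters():
                p.requires_grad = False
            # Newly defined adaptation layers mapping ImageNet features to task classes
            # (layer sizes are assumptions, not taken from the paper).
            self.adaptation = nn.Sequential(
                nn.Flatten(),
                nn.Linear(2048, 256),
                nn.ReLU(),
                nn.Dropout(0.5),
                nn.Linear(256, num_tasks),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: fMRI data rendered as 3-channel 2D maps, shape (N, 3, 224, 224);
            # how volumes are projected to this shape is itself an assumption here.
            return self.adaptation(self.features(x))

    model = DTLClassifier()
    logits = model(torch.randn(4, 3, 224, 224))  # dummy batch
    print(logits.shape)  # torch.Size([4, 7])

Freezing the backbone means only the small adaptation head is fit to the fMRI data, which is consistent with the abstract's finding that the approach works well on small datasets; fine-tuning some backbone layers would be a natural variation.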
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2907040