Design of Deep Learning Model for Task-Evoked fMRI Data Classification

Machine learning methods have been successfully applied to neuroimaging signals, one application being the decoding of specific task states from functional magnetic resonance imaging (fMRI) data. In this paper, we propose a model that simultaneously exploits the spatial and temporal sequential characteristics of fMRI data with deep neural networks to classify fMRI task states. We designed a convolutional network module and a recurrent network module to extract the spatial and temporal features of fMRI data, respectively. In particular, we added an attention mechanism to the recurrent network module, which more effectively highlights the brain activation state at the moment of reaction. We evaluated the model on task-evoked fMRI data from the Human Connectome Project (HCP) dataset; the classification accuracy reached 94.31%, and the experimental results show that the model can effectively distinguish brain states under different task stimuli.

Detailed Description

Saved in:
Bibliographic Details
Published in: Computational intelligence and neuroscience 2021, Vol.2021 (1), p.6660866-6660866
Main Authors: Huang, Xiaojie; Xiao, Jun; Wu, Chao
Format: Article
Language: English
Subjects:
Online Access: Full text
container_end_page 6660866
container_issue 1
container_start_page 6660866
container_title Computational intelligence and neuroscience
container_volume 2021
creator Huang, Xiaojie
Xiao, Jun
Wu, Chao
description Machine learning methods have been successfully applied to neuroimaging signals, one application being the decoding of specific task states from functional magnetic resonance imaging (fMRI) data. In this paper, we propose a model that simultaneously exploits the spatial and temporal sequential characteristics of fMRI data with deep neural networks to classify fMRI task states. We designed a convolutional network module and a recurrent network module to extract the spatial and temporal features of fMRI data, respectively. In particular, we added an attention mechanism to the recurrent network module, which more effectively highlights the brain activation state at the moment of reaction. We evaluated the model on task-evoked fMRI data from the Human Connectome Project (HCP) dataset; the classification accuracy reached 94.31%, and the experimental results show that the model can effectively distinguish brain states under different task stimuli.
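The abstract describes an attention mechanism added to the recurrent module that weights the hidden state at each fMRI time step before classification. As a rough, generic illustration (this is not the authors' implementation; the function names and the scoring vector `w` are hypothetical), attention pooling over per-time-step hidden states can be sketched as:

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_pool(hidden_states, w):
    """hidden_states: list of T hidden vectors (one per fMRI time step);
    w: a learned scoring vector of the same dimension."""
    # score each time step, then normalize scores into attention weights
    scores = [sum(wi * hi for wi, hi in zip(w, h)) for h in hidden_states]
    alphas = softmax(scores)
    # pooled context vector: attention-weighted sum over time steps
    dim = len(hidden_states[0])
    context = [sum(a * h[d] for a, h in zip(alphas, hidden_states))
               for d in range(dim)]
    return context, alphas

# toy example: three time steps, two-dimensional hidden states
h = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
context, alphas = attention_pool(h, [1.0, 1.0])
```

The pooled `context` vector would feed the classifier, while `alphas` indicates which time steps (e.g. the moment of reaction mentioned in the abstract) the model emphasizes.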
doi_str_mv 10.1155/2021/6660866
format Article
publisher New York: Hindawi
pmid 34422034
contributor Dourado, António
rights Copyright © 2021 Xiaojie Huang et al. This is an open access article distributed under the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
orcidid https://orcid.org/0000-0003-0303-134X
https://orcid.org/0000-0003-0885-6869
fulltext fulltext
identifier ISSN: 1687-5265
ispartof Computational intelligence and neuroscience, 2021, Vol.2021 (1), p.6660866-6660866
issn 1687-5265
1687-5273
language eng
recordid cdi_pubmedcentral_primary_oai_pubmedcentral_nih_gov_8378948
source Wiley Online Library Open Access; EZB-FREE-00999 freely available EZB journals; PubMed Central; Alma/SFX Local Collection; PubMed Central Open Access
subjects Artificial neural networks
Brain
Brain mapping
Brain research
Cable television broadcasting industry
Classification
Datasets
Deep learning
Feature extraction
Functional magnetic resonance imaging
Learning algorithms
Machine learning
Magnetic resonance imaging
Medical imaging
Modules
Neural networks
Neuroimaging
Temporal variations
title Design of Deep Learning Model for Task-Evoked fMRI Data Classification
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-02T16%3A04%3A19IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_pubme&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Design%20of%20Deep%20Learning%20Model%20for%20Task-Evoked%20fMRI%20Data%20Classification&rft.jtitle=Computational%20intelligence%20and%20neuroscience&rft.au=Huang,%20Xiaojie&rft.date=2021&rft.volume=2021&rft.issue=1&rft.spage=6660866&rft.epage=6660866&rft.pages=6660866-6660866&rft.issn=1687-5265&rft.eissn=1687-5273&rft_id=info:doi/10.1155/2021/6660866&rft_dat=%3Cgale_pubme%3EA683574767%3C/gale_pubme%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2563364319&rft_id=info:pmid/34422034&rft_galeid=A683574767&rfr_iscdi=true