Dynamic Brain Transformer with Multi-level Attention for Functional Brain Network Analysis


Detailed Description

Saved in:
Bibliographic Details
Main Authors: Kan, Xuan; Gu, Antonio Aodong Chen; Cui, Hejie; Guo, Ying; Yang, Carl
Format: Article
Language: eng
Subjects:
Online Access: Request full text
creator Kan, Xuan
Gu, Antonio Aodong Chen
Cui, Hejie
Guo, Ying
Yang, Carl
description Recent neuroimaging studies have highlighted the importance of network-centric brain analysis, particularly with functional magnetic resonance imaging. The emergence of Deep Neural Networks has fostered a substantial interest in predicting clinical outcomes and categorizing individuals based on brain networks. However, the conventional approach involving static brain network analysis offers limited potential in capturing the dynamism of brain function. Although recent studies have attempted to harness dynamic brain networks, their high dimensionality and complexity present substantial challenges. This paper proposes a novel methodology, Dynamic bRAin Transformer (DART), which combines static and dynamic brain networks for more effective and nuanced brain function analysis. Our model uses the static brain network as a baseline, integrating dynamic brain networks to enhance performance against traditional methods. We innovatively employ attention mechanisms, enhancing model explainability and exploiting the dynamic brain network's temporal variations. The proposed approach offers a robust solution to the low signal-to-noise ratio of blood-oxygen-level-dependent signals, a recurring issue in direct DNN modeling. It also provides valuable insights into which brain circuits or dynamic networks contribute more to final predictions. As such, DART shows a promising direction in neuroimaging studies, contributing to the comprehensive understanding of brain organization and the role of neural circuits.
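The abstract's core idea — attention-weighting dynamic brain-network snapshots against a static baseline network — can be illustrated with a minimal NumPy sketch. This is an illustration only, not the paper's actual transformer architecture; the function name, the similarity-based attention scores, and the array shapes are all invented for this example.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def temporal_attention_pool(dynamic_nets, static_net):
    """Hypothetical sketch: score each windowed connectivity matrix by its
    elementwise similarity to the static (full-scan) network, then
    attention-pool the windows into one summary network.

    dynamic_nets: (T, N, N) array of per-window connectivity matrices.
    static_net:   (N, N) static connectivity matrix used as the baseline.
    Returns (pooled (N, N), attention weights (T,))."""
    scores = np.array([float(np.sum(d * static_net)) for d in dynamic_nets])
    weights = softmax(scores)  # attention distribution over time windows
    pooled = np.tensordot(weights, dynamic_nets, axes=1)  # weighted sum -> (N, N)
    return pooled, weights
```

The attention weights double as a rough explainability signal, indicating which time windows dominate the pooled network — analogous in spirit to the abstract's claim that attention reveals which dynamic networks contribute most to predictions.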
doi_str_mv 10.48550/arxiv.2309.01941
format Article
identifier DOI: 10.48550/arxiv.2309.01941
language eng
recordid cdi_arxiv_primary_2309_01941
source arXiv.org
subjects Computer Science - Artificial Intelligence
Computer Science - Learning
Quantitative Biology - Neurons and Cognition
title Dynamic Brain Transformer with Multi-level Attention for Functional Brain Network Analysis
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-18T20%3A01%3A56IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Dynamic%20Brain%20Transformer%20with%20Multi-level%20Attention%20for%20Functional%20Brain%20Network%20Analysis&rft.au=Kan,%20Xuan&rft.date=2023-09-05&rft_id=info:doi/10.48550/arxiv.2309.01941&rft_dat=%3Carxiv_GOX%3E2309_01941%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true