A hybrid network for fiber orientation distribution reconstruction employing multi‐scale information
Published in: Medical physics (Lancaster), 2025-02, Vol. 52 (2), p. 1019-1036
Format: Article
Language: English
Online access: Full text
Authors: Yu, Hanyang; Ai, Lingmei; Yao, Ruoxia; Li, Jiahao
Abstract

Background
Accurate fiber orientation distribution (FOD) is crucial for resolving complex neural fiber structures. However, existing reconstruction methods often fail to integrate both global and local FOD information, as well as the directional information of fixels, which limits reconstruction accuracy. Additionally, these methods overlook the spatial positional relationships between voxels, resulting in extracted features that lack continuity. In regions with signal distortion, many methods also exhibit issues with reconstruction artifacts.

Purpose
This study addresses these challenges by introducing a new neural network called Fusion‐Net.

Methods
Fusion‐Net comprises both the FOD reconstruction network and the peak direction estimation network. The FOD reconstruction network efficiently fuses the global and local features of the FOD, providing these features with spatial positional information through a competitive coordinate attention mechanism and a progressive updating mechanism, thus ensuring feature continuity. The peak direction estimation network redefines the task of estimating fixel peak directions as a multi‐class classification problem. It uses a direction‐aware loss function to supply directional information to the FOD reconstruction network. Additionally, we introduce a larger input scale for Fusion‐Net to compensate for local signal distortion by incorporating more global information.

Results
Experimental results demonstrate that the rich FOD features contribute to promising performance in Fusion‐Net. The network effectively utilizes these features to enhance reconstruction accuracy while incorporating more global information, effectively mitigating the issue of local signal distortion.

Conclusions
This study demonstrates the feasibility of Fusion‐Net for reconstructing FOD, providing reliable references for clinical applications.
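The Methods section recasts fixel peak-direction estimation as multi-class classification over a fixed set of candidate directions, trained with a direction-aware loss. The paper's actual loss and direction set are not reproduced in this record; the following is only a minimal hypothetical sketch of what such a loss could look like, with all names and the specific functional form being assumptions:

```python
# Hypothetical sketch: peak-direction estimation as multi-class classification
# over K fixed unit directions, with a loss that penalizes a prediction in
# proportion to its angular distance from the true direction. The function
# name, signature, and loss form are illustrative assumptions, not the
# paper's definitions.
import numpy as np

def direction_aware_loss(logits, target_idx, directions):
    """Cross-entropy plus expected angular error under the predicted distribution.

    logits:     (K,) unnormalized scores, one per candidate direction.
    target_idx: index of the ground-truth direction in `directions`.
    directions: (K, 3) unit vectors discretizing the sphere of directions.
    """
    # Softmax over candidate directions (shifted for numerical stability).
    z = logits - logits.max()
    p = np.exp(z) / np.exp(z).sum()

    # Angular distance of every candidate to the target. The absolute value
    # makes the distance antipodally symmetric, since a fiber direction and
    # its negation describe the same fixel.
    cosines = np.abs(directions @ directions[target_idx])
    angles = np.arccos(np.clip(cosines, -1.0, 1.0))  # radians, in [0, pi/2]

    # Standard cross-entropy on the target class, plus the expected angular
    # error: wrong classes that point nearly the right way cost less.
    return float((p * angles).sum() - np.log(p[target_idx] + 1e-12))
```

Under this sketch, a prediction concentrated on a direction 10 degrees off the truth incurs a smaller penalty than one 90 degrees off, which is the sense in which the loss is "direction-aware" rather than treating all misclassifications equally.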
DOI: 10.1002/mp.17505
PMID: 39565936
ISSN: 0094-2405
EISSN: 2473-4209
Source: MEDLINE; Wiley Online Library Journals Frontfile Complete
Subjects: attention mechanism; convolutional neural network; fiber orientation distribution reconstruction; Image Processing, Computer-Assisted - methods; multi‐layer perceptron; Neural Networks, Computer; peak direction estimation