Parallel Multi‐Scale Network with Attention Mechanism for Pancreas Segmentation


Bibliographic Details
Published in: IEEJ Transactions on Electrical and Electronic Engineering, 2022-01, Vol. 17 (1), p. 110-119
Authors: Long, Jianwu; Song, Xinlei; An, Yong; Li, Tong; Zhu, Jiangzhou
Format: Article
Language: English
Online access: Full text
Description: In this paper, we address the task of segmenting small organs (i.e., the pancreas) from abdominal CT scans. Because the target often occupies a relatively small region of the input image, deep neural networks are easily confused by complex and variable backgrounds. We propose a parallel multi-scale network with an attention mechanism for pancreas segmentation, which better balances the semantic segmentation, classification, and localization tasks. We use a parallel network to connect the feature maps between different bottleneck layers, which contain rich semantic information and complete spatial information. We apply an attention module to enhance the key features of the semantic information. We then fuse the outputs of the two modules and apply the result as attention information on the feature map, ensuring full fusion of contextual semantic information and spatial information and thereby improving segmentation accuracy. We conduct extensive experiments on the NIH pancreas segmentation data set; our model achieves a mean Dice coefficient of 86.6. © 2021 Institute of Electrical Engineers of Japan. Published by Wiley Periodicals LLC.
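The architecture is only summarized in the abstract, so the authors' exact design is not reproduced here. The following PyTorch sketch merely illustrates the general pattern the abstract describes, fusing a deep (semantic) and a shallow (spatial) feature map and applying a learned attention map back onto the fused features; all module names, channel sizes, and the specific attention form are illustrative assumptions.

```python
# Minimal sketch of attention-weighted multi-scale fusion (illustrative only,
# not the paper's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionFusion(nn.Module):
    """Fuse a deep (semantic) and a shallow (spatial) feature map with a
    learned per-pixel attention weighting. Channel counts are hypothetical."""

    def __init__(self, deep_ch: int, shallow_ch: int, out_ch: int):
        super().__init__()
        self.deep_proj = nn.Conv2d(deep_ch, out_ch, kernel_size=1)
        self.shallow_proj = nn.Conv2d(shallow_ch, out_ch, kernel_size=1)
        # 1x1 conv + sigmoid produces an attention map with values in [0, 1]
        self.attn = nn.Sequential(
            nn.Conv2d(out_ch, out_ch, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, deep: torch.Tensor, shallow: torch.Tensor) -> torch.Tensor:
        # Upsample the low-resolution semantic features to the spatial size
        # of the shallow features before fusing.
        deep = F.interpolate(self.deep_proj(deep), size=shallow.shape[2:],
                             mode="bilinear", align_corners=False)
        fused = deep + self.shallow_proj(shallow)
        # Apply the attention map back onto the fused features; the residual
        # form keeps the original signal where attention is near zero.
        return fused + fused * self.attn(fused)

# Example: fuse a 512-channel bottleneck map with a 64-channel early-stage map.
if __name__ == "__main__":
    deep = torch.randn(1, 512, 16, 16)
    shallow = torch.randn(1, 64, 128, 128)
    out = AttentionFusion(512, 64, 64)(deep, shallow)
    print(out.shape)  # torch.Size([1, 64, 128, 128])
```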
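The reported score is the Dice similarity coefficient, the standard overlap metric for segmentation: DSC = 2|A ∩ B| / (|A| + |B|), where A and B are the predicted and ground-truth masks; the 86.6 figure reads naturally as a percentage. A minimal NumPy implementation of the standard metric:

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice similarity coefficient between two binary masks
    (1 = pancreas, 0 = background). eps guards against empty masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)
```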
DOI: 10.1002/tee.23493
ISSN: 1931-4973
EISSN: 1931-4981
Source: Wiley Online Library Journals Frontfile Complete
Subjects:
Artificial neural networks
attention module
Computed tomography
Feature maps
Image segmentation
Modules
Multisensor fusion
Organs
Pancreas
pancreas segmentation
parallel multi‐scale network
Semantic segmentation
Semantics
Spatial data