AIU-Net: An Efficient Deep Convolutional Neural Network for Brain Tumor Segmentation

Automatic and accurate segmentation of brain tumors plays an important role in their diagnosis and treatment. To improve segmentation accuracy, this paper proposes an improved multimodal MRI brain tumor segmentation algorithm based on U-net. In the original U-net, the contracting path uses pooling layers to reduce the resolution of the feature maps and enlarge the receptive field, and the expanding path uses upsampling to restore the feature-map size. Some image detail is lost in this process, which lowers segmentation accuracy. The proposed network, AIU-net (Atrous-Inception U-net), replaces the original convolution blocks in the U-net encoder with an A-inception (atrous-inception) module, an inception structure with atrous convolutions that increases the depth and width of the network and enlarges the receptive field without adding parameters. To capture multiscale features, an atrous spatial pyramid pooling (ASPP) module is also introduced. On the BraTS (multimodal brain tumor segmentation challenge) dataset, the method achieves dice scores of 0.93 for the enhancing tumor region, 0.86 for the whole tumor region, and 0.92 for the tumor core region, improving segmentation accuracy.
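The dice scores reported in the abstract measure overlap between a predicted mask and the ground-truth mask. As a minimal sketch (the function name and the flat 0/1 list representation are illustrative assumptions, not from the paper), the metric can be computed like this:

```python
def dice_score(pred, target):
    """Dice similarity coefficient between two flat binary masks (lists of 0/1).

    dice = 2*|P & T| / (|P| + |T|); returns 1.0 when both masks are empty.
    """
    intersection = sum(p * t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

# A prediction covering 1 of 2 tumor voxels with no false positives:
# dice = 2*1 / (1 + 2), i.e. about 0.667
print(dice_score([1, 0, 0, 0], [1, 1, 0, 0]))
```

A score of 1.0 means perfect overlap, so the reported 0.93/0.86/0.92 values indicate close agreement with the expert annotations on all three tumor regions.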

Detailed description

Saved in:
Bibliographic details
Published in: Mathematical problems in engineering, 2021-08, Vol. 2021, p. 1-8
Main authors: Jiang, Yongchao; Ye, Mingquan; Huang, Daobin; Lu, Xiaojie
Format: Article
Language: eng
Subjects:
Online access: Full text
container_end_page 8
container_issue
container_start_page 1
container_title Mathematical problems in engineering
container_volume 2021
creator Jiang, Yongchao
Ye, Mingquan
Huang, Daobin
Lu, Xiaojie
description Automatic and accurate segmentation of brain tumors plays an important role in their diagnosis and treatment. To improve segmentation accuracy, this paper proposes an improved multimodal MRI brain tumor segmentation algorithm based on U-net. In the original U-net, the contracting path uses pooling layers to reduce the resolution of the feature maps and enlarge the receptive field, and the expanding path uses upsampling to restore the feature-map size. Some image detail is lost in this process, which lowers segmentation accuracy. The proposed network, AIU-net (Atrous-Inception U-net), replaces the original convolution blocks in the U-net encoder with an A-inception (atrous-inception) module, an inception structure with atrous convolutions that increases the depth and width of the network and enlarges the receptive field without adding parameters. To capture multiscale features, an atrous spatial pyramid pooling (ASPP) module is also introduced. On the BraTS (multimodal brain tumor segmentation challenge) dataset, the method achieves dice scores of 0.93 for the enhancing tumor region, 0.86 for the whole tumor region, and 0.92 for the tumor core region, improving segmentation accuracy.
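The abstract's claim that atrous convolution enlarges the receptive field without adding parameters follows from simple arithmetic: a k×k kernel with dilation d still has k² weights per channel pair, but its taps span k + (k−1)(d−1) input positions along each axis. A minimal sketch of that arithmetic (the function names are illustrative assumptions, not from the paper):

```python
def dilated_extent(kernel_size, dilation):
    """Spatial extent covered by a dilated (atrous) kernel along one axis."""
    return kernel_size + (kernel_size - 1) * (dilation - 1)

def conv_weight_count(kernel_size, in_channels, out_channels):
    """Weight count of a square 2D convolution (bias ignored).

    Note that dilation does not appear: it spreads the taps out
    without adding any parameters.
    """
    return kernel_size * kernel_size * in_channels * out_channels

# A 3x3 kernel at dilations 1, 2, and 4 covers 3, 5, and 9 input
# positions per axis, while the weight count stays fixed.
print(dilated_extent(3, 1), dilated_extent(3, 2), dilated_extent(3, 4))
print(conv_weight_count(3, 64, 64))
```

This is why stacking A-inception branches with different dilation rates (and the ASPP module's parallel atrous branches) can aggregate multiscale context at constant parameter cost.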
doi_str_mv 10.1155/2021/7915706
format Article
fullrecord publisher: Hindawi (New York); published: 2021-08-04; academic editor: Wu, Shianghau; rights: Copyright © 2021 Yongchao Jiang et al., open access under the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0); ORCID: 0000-0001-9432-2382, 0000-0001-5394-1742, 0000-0002-0237-4159, 0000-0002-5165-7796
fulltext fulltext
identifier ISSN: 1024-123X
ispartof Mathematical problems in engineering, 2021-08, Vol.2021, p.1-8
issn 1024-123X
1563-5147
language eng
recordid cdi_proquest_journals_2561329261
source EZB-FREE-00999 freely available EZB journals; Wiley Online Library (Open Access Collection); Alma/SFX Local Collection
subjects Accuracy
Algorithms
Artificial neural networks
Brain
Brain cancer
Coders
Deep learning
Image restoration
Image segmentation
Magnetic resonance imaging
Medical diagnosis
Medical imaging
Modules
Neural networks
Semantics
Tumors
title AIU-Net: An Efficient Deep Convolutional Neural Network for Brain Tumor Segmentation
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-03T08%3A13%3A35IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=AIU-Net:%20An%20Efficient%20Deep%20Convolutional%20Neural%20Network%20for%20Brain%20Tumor%20Segmentation&rft.jtitle=Mathematical%20problems%20in%20engineering&rft.au=Jiang,%20Yongchao&rft.date=2021-08-04&rft.volume=2021&rft.spage=1&rft.epage=8&rft.pages=1-8&rft.issn=1024-123X&rft.eissn=1563-5147&rft_id=info:doi/10.1155/2021/7915706&rft_dat=%3Cproquest_cross%3E2561329261%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2561329261&rft_id=info:pmid/&rfr_iscdi=true