BGRD-TransUNet: A Novel TransUNet-Based Model for Ultrasound Breast Lesion Segmentation

Breast UltraSound (BUS) imaging is a commonly used diagnostic tool in combating breast diseases, especially for the early detection and diagnosis of breast cancer. Due to the inherent characteristics of ultrasound images, such as blurry boundaries and diverse tumor morphologies, it is...

Detailed description

Saved in:
Bibliographic details
Published in: IEEE Access, 2024, Vol. 12, pp. 31182-31196
Main authors: Ji, Zhanlin; Sun, Haoran; Yuan, Na; Zhang, Haiyang; Sheng, Jiaxi; Zhang, Xueji; Ganchev, Ivan
Format: Article
Language: English
Subjects:
Online access: Full text
Description: Breast UltraSound (BUS) imaging is a commonly used diagnostic tool in combating breast diseases, especially for the early detection and diagnosis of breast cancer. Due to the inherent characteristics of ultrasound images, such as blurry boundaries and diverse tumor morphologies, it is challenging for doctors to manually segment breast tumors. In recent years, Convolutional Neural Network (CNN) technology has been widely applied to automatically segment BUS images. However, CNNs are inherently limited in capturing global contextual information. To address this issue, the paper proposes a novel BGRD-TransUNet model for breast lesion segmentation, based on TransUNet. The proposed model first replaces the original ResNet50 backbone network of TransUNet with DenseNet121 for initial feature extraction. Next, newly designed Residual Multi-Scale Feature Modules (RMSFMs) are employed to extract features from various layers of DenseNet121, thus capturing richer features within specific layers. Thirdly, a Boundary Guidance (BG) network is added to enhance the contour information of BUS images. Additionally, newly designed Boundary Attentional Feature Fusion Modules (BAFFMs) are used to integrate the edge information with the features extracted through the RMSFMs. Finally, newly designed Parallel Channel and Spatial Attention Modules (PCSAMs) are used to refine feature extraction using channel and spatial attention. Extensive experimental testing on two public datasets demonstrates that the proposed BGRD-TransUNet model outperforms all state-of-the-art medical image segmentation models participating in the experiments according to all evaluation metrics used (except in a few isolated cases), including the two most important and widely used metrics in the field of medical image segmentation, namely the Intersection over Union (IoU) and the Dice Similarity Coefficient (DSC).
More specifically, on the BUSI dataset and dataset B, BGRD-TransUNet achieves IoU values of 76.77% and 86.61%, and DSC values of 85.08% and 92.47%, respectively: 7.27 and 3.64 percentage points higher in IoU, and 5.81 and 2.54 percentage points higher in DSC, than the corresponding values achieved by the baseline TransUNet.
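The two metrics highlighted in the abstract, IoU and DSC, have standard definitions for binary segmentation masks. As a quick illustration (not code from the paper, just the textbook formulas applied to toy NumPy masks):

```python
import numpy as np

def iou(pred, target):
    """Intersection over Union (Jaccard index) for binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return inter / union if union else 1.0  # both masks empty -> perfect match

def dice(pred, target):
    """Dice Similarity Coefficient: 2|A∩B| / (|A| + |B|)."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    denom = pred.sum() + target.sum()
    return 2 * inter / denom if denom else 1.0

# Toy 2x3 masks: intersection = 2 pixels, union = 4 pixels
pred = np.array([[1, 1, 0], [0, 1, 0]])
target = np.array([[1, 0, 0], [0, 1, 1]])
print(iou(pred, target))   # 0.5
print(dice(pred, target))  # 2*2/(3+3) = 0.666...
```

Note that DSC is always at least as large as IoU for the same pair of masks (DSC = 2·IoU/(1+IoU)), which is consistent with the reported figures (e.g. 85.08% DSC vs. 76.77% IoU on BUSI).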
DOI: 10.1109/ACCESS.2024.3368170
ISSN: 2169-3536
Source: IEEE Open Access Journals; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals
Subjects:
Artificial neural networks
Biomedical imaging
Breast biopsy
Breast cancer
Breast disease
breast ultrasound (BUS)
Computational modeling
Computer networks
Convolutional neural networks
Datasets
Feature extraction
Image enhancement
Image segmentation
Lesions
Medical diagnostic imaging
medical image segmentation
Medical imaging
Modules
Transformers
TransUNet
tumor segmentation
Tumors
Ultrasonic imaging