Boundary-rendering network for breast lesion segmentation in ultrasound images
Highlights:

• A specialized segmentation model that can address blurry or occluded edges in ultrasound images.
• A differentiable boundary selection module that can automatically focus on the marginal area.
• A GCN-based boundary rendering module that can incorporate global contour information.
• A unified framework that can perform segmentation and classification simultaneously.
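The highlights above name two components, a differentiable boundary selection module and a GCN-based boundary rendering module, only at a conceptual level. The sketch below is a minimal, hypothetical illustration of those two ideas in PyTorch, not the authors' implementation: it picks the most ambiguous pixels of a coarse probability map (those closest to 0.5) as boundary-region points, and refines features sampled at contour points with a tiny graph convolution over a cyclic contour graph. All class names, layer sizes, and parameters (`BoundarySelect`, `ContourGCN`, `num_points`, and so on) are assumptions.

```python
# Illustrative-only sketch (not the paper's code) of uncertainty-based boundary
# point selection and a small GCN over a cyclic contour graph, assuming PyTorch.
import torch
import torch.nn as nn


class BoundarySelect(nn.Module):
    """Pick the top-k most uncertain locations (probability closest to 0.5)."""

    def __init__(self, num_points: int = 128):
        super().__init__()
        self.num_points = num_points

    def forward(self, prob: torch.Tensor) -> torch.Tensor:
        # prob: (B, 1, H, W) foreground probabilities from a coarse segmentation head.
        b, _, h, w = prob.shape
        uncertainty = -(prob - 0.5).abs().view(b, -1)       # largest near the boundary
        idx = uncertainty.topk(self.num_points, dim=1).indices
        ys, xs = idx // w, idx % w
        # Normalised (x, y) coordinates of the selected boundary-region points.
        return torch.stack([xs / (w - 1), ys / (h - 1)], dim=-1)  # (B, k, 2)


class ContourGCN(nn.Module):
    """Tiny GCN over a cyclic contour graph: X' = ReLU(A X W), then per-point offsets."""

    def __init__(self, in_dim: int, hidden: int = 64, layers: int = 2):
        super().__init__()
        dims = [in_dim] + [hidden] * layers
        self.lins = nn.ModuleList(nn.Linear(a, b) for a, b in zip(dims, dims[1:]))
        self.offset = nn.Linear(hidden, 2)                  # per-point (dx, dy) refinement

    @staticmethod
    def cyclic_adj(k: int) -> torch.Tensor:
        # Each contour point is connected to itself and its two neighbours on the contour.
        a = torch.eye(k) + torch.eye(k).roll(1, 0) + torch.eye(k).roll(-1, 0)
        return a / a.sum(-1, keepdim=True)                  # row-normalised adjacency

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, k, in_dim) features sampled at the k contour points.
        adj = self.cyclic_adj(feats.size(1)).to(feats.device)
        x = feats
        for lin in self.lins:
            x = torch.relu(lin(adj @ x))
        return self.offset(x)                               # (B, k, 2) contour refinements


if __name__ == "__main__":
    prob = torch.rand(2, 1, 96, 96)                 # stand-in coarse probability map
    pts = BoundarySelect(num_points=64)(prob)
    print(pts.shape)                                # torch.Size([2, 64, 2])
    feats = torch.randn(2, 64, 32)                  # stand-in for features sampled at pts
    print(ContourGCN(in_dim=32)(feats).shape)       # torch.Size([2, 64, 2])
```

In the framework described here, the selected boundary region and the contour-point features would come from the learned segmentation backbone; the random tensors above stand in purely to show the tensor shapes involved.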
| Published in: | Medical image analysis, 2022-08, Vol. 80, Article 102478, p. 102478 |
|---|---|
| Main authors: | Huang, Ruobing; Lin, Mingrong; Dou, Haoran; Lin, Zehui; Ying, Qilong; Jia, Xiaohong; Xu, Wenwen; Mei, Zihan; Yang, Xin; Dong, Yijie; Zhou, Jianqiao; Ni, Dong |
| Format: | Article |
| Language: | English |
| Subjects: | Breast cancer; Graph-convolution network; Segmentation; Ultrasound |
| Online access: | Full text |
| DOI: | 10.1016/j.media.2022.102478 |
| ISSN: | 1361-8415 |
| EISSN: | 1361-8423 |
| PMID: | 35691144 |
| Publisher: | Elsevier B.V. (Netherlands) |
Abstract:

Breast Ultrasound (BUS) has proven to be an effective tool for the early detection of breast cancer. Lesion segmentation identifies the boundary, shape, and location of the target and serves as a crucial step toward accurate diagnosis. Despite recent efforts to develop machine learning algorithms that automate this process, problems remain because of blurry or occluded edges and highly irregular nodule shapes. Existing methods often produce over-smooth or inaccurate results, failing to capture the detailed boundary structures that are of clinical interest. To overcome these challenges, we propose a novel boundary-rendering framework that explicitly highlights the importance of the boundary for automated nodule segmentation in BUS images. It uses a boundary selection module to automatically focus on the ambiguous boundary region and a graph convolution-based boundary rendering module to exploit global contour information. Furthermore, the proposed framework embeds nodule classification via semantic segmentation and encourages co-learning across tasks. Validation experiments were performed on different BUS datasets to verify the robustness of the proposed method. Results show that the proposed method outperforms state-of-the-art segmentation approaches (Dice = 0.854, IoU = 0.919, HD = 17.8) in nodule delineation and obtains higher classification accuracy than classical classification models.
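The abstract reports Dice, IoU, and Hausdorff distance (HD) for nodule delineation. Below is a minimal sketch, using only NumPy and SciPy, of how these three metrics can be computed for a pair of binary masks. It is not the authors' evaluation code; details such as pixel spacing or whether HD is restricted to boundary pixels are not specified in the record and are assumptions here.

```python
# Minimal, hypothetical sketch of Dice, IoU, and Hausdorff distance for binary masks.
import numpy as np
from scipy.spatial.distance import directed_hausdorff


def dice_iou(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-8):
    """Dice coefficient and IoU for two binary masks of the same shape."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    dice = 2.0 * inter / (pred.sum() + gt.sum() + eps)
    iou = inter / (np.logical_or(pred, gt).sum() + eps)
    return dice, iou


def hausdorff(pred: np.ndarray, gt: np.ndarray) -> float:
    """Symmetric Hausdorff distance between the two foreground pixel sets."""
    p = np.argwhere(pred.astype(bool))
    g = np.argwhere(gt.astype(bool))
    return max(directed_hausdorff(p, g)[0], directed_hausdorff(g, p)[0])


if __name__ == "__main__":
    # Toy example: a square "lesion" versus a slightly shifted prediction.
    gt = np.zeros((64, 64), dtype=np.uint8)
    gt[20:40, 20:40] = 1
    pred = np.zeros_like(gt)
    pred[22:42, 22:42] = 1
    d, i = dice_iou(pred, gt)
    print(f"Dice={d:.3f}, IoU={i:.3f}, HD={hausdorff(pred, gt):.1f}")
```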