Accurate segmentation of breast tumor in ultrasound images through joint training and refined segmentation
Published in: Physics in Medicine & Biology, 2022-09, Vol. 67 (17), p. 175013
Authors: Shen, Xiaoyan; Wu, Xinran; Liu, Ruibo; Li, Hong; Yin, Jiandong; Wang, Liangyu; Ma, He
Format: Article
Language: English
Subjects: breast ultrasound; deep learning; joint training; segmentation; watershed
Online access: Full text
Abstract:

Objective.
This paper proposes an automatic breast tumor segmentation method for two-dimensional (2D) ultrasound images, which is significantly more accurate, robust, and adaptable than common deep learning models on small datasets.
Approach.
A generalized joint training and refined segmentation framework (JR) was established, consisting of a joint training module (J module) and a refined segmentation module (R module). In the J module, two segmentation networks are trained simultaneously under the guidance of the proposed Jocor for Segmentation (JFS) algorithm. In the R module, the output of the J module is refined by the proposed area first (AF) algorithm and the marked watershed (MW) algorithm. The AF algorithm mainly reduces false positives, which arise easily from the inherent features of breast ultrasound images, based on the area, distance, average radical derivative (ARD) and radical gradient index (RGI) of candidate contours. Meanwhile, the MW algorithm avoids over-segmentation and refines the segmentation results. To verify its performance, the JR framework was evaluated on three breast ultrasound image datasets. Image dataset A contains 1036 images from local hospitals. Image datasets B and C are two public datasets, containing 562 and 163 images, respectively. The evaluation was followed by related ablation experiments.
Main results.
The JR outperformed the other state-of-the-art (SOTA) methods on the three image datasets, especially on image dataset B. Compared with the SOTA methods, the JR improved the true positive ratio (TPR) and Jaccard index (JI) by 1.5% and 3.2%, respectively, and reduced the false positive ratio (FPR) by 3.7% on image dataset B. The results of the ablation experiments show that each component of the JR matters and contributes to the segmentation accuracy, particularly in the reduction of false positives.
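For reference, the sketch below shows one common way these pixel-wise metrics are computed from a predicted binary mask and a ground-truth mask. The exact normalization of FPR varies between breast-ultrasound segmentation papers (it is often taken relative to the ground-truth lesion area), so the definitions here are assumptions rather than the paper's exact formulas.

```python
import numpy as np

def segmentation_metrics(pred, gt):
    """Pixel-wise TPR, FPR and Jaccard index for binary masks (1 = tumor).
    Assumes a non-empty ground-truth mask."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    fp = np.logical_and(pred, ~gt).sum()
    fn = np.logical_and(~pred, gt).sum()
    tpr = tp / (tp + fn)        # true positive ratio (recall)
    fpr = fp / gt.sum()         # false positives relative to lesion area (one common convention)
    ji = tp / (tp + fp + fn)    # Jaccard index (intersection over union)
    return tpr, fpr, ji
```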
Significance.
This study successfully combines traditional segmentation methods with deep learning models. The proposed method can segment small-scale breast ultrasound image datasets efficiently and effectively, with excellent generalization performance.
DOI: 10.1088/1361-6560/ac8964
ISSN: 0031-9155
EISSN: 1361-6560
Publisher: IOP Publishing