Interpretable diagnosis of breast lesions in ultrasound imaging using deep multi-stage reasoning

Bibliographic details
Published in: Physics in medicine & biology, 2024-10, Vol. 69 (21), p. 215025
Main authors: Cui, Kaixuan; Liu, Weiyong; Wang, Dongyue
Format: Article
Language: English
Online access: Full text
Abstract: Ultrasound is the primary screening test for breast cancer. However, providing an interpretable auxiliary diagnosis of breast lesions is a challenging task. This study aims to develop an interpretable auxiliary diagnostic method to enhance usability in human-machine collaborative diagnosis. To address this issue, this study proposes the deep multi-stage reasoning method (DMSRM), which provides individual and overall breast imaging-reporting and data system (BI-RADS) assessment categories for breast lesions. In the first stage of the DMSRM, the individual BI-RADS assessment network (IBRANet) is designed to capture lesion features from breast ultrasound images. IBRANet performs individual BI-RADS assessments of breast lesions using ultrasound images, focusing on specific features such as margin, contour, echogenicity, calcification, and vascularity. In the second stage, evidence reasoning (ER) is employed to achieve uncertain information fusion and reach an overall BI-RADS assessment of the breast lesions. To evaluate the performance of DMSRM at each stage, two test sets are utilized: the first for individual BI-RADS assessment, containing 4322 ultrasound images; the second for overall BI-RADS assessment, containing 175 sets of ultrasound image pairs. In the individual BI-RADS assessment of margin, contour, echogenicity, calcification, and vascularity, IBRANet achieves accuracies of 0.9491, 0.9466, 0.9293, 0.9234, and 0.9625, respectively. In the overall BI-RADS assessment of lesions, the ER achieves an accuracy of 0.8502. Compared to independent diagnosis, the human-machine collaborative diagnosis results of three radiologists show increases in positive predictive value by 0.0158, 0.0427, and 0.0401, in sensitivity by 0.0400, 0.0600, and 0.0434, and in area under the curve by 0.0344, 0.0468, and 0.0255. This study proposes a DMSRM that enhances the transparency of the diagnostic reasoning process. Results indicate that DMSRM exhibits robust BI-RADS assessment capabilities and provides an interpretable reasoning process that better suits clinical needs.
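The two-stage idea in the abstract — five per-feature BI-RADS assessments fused by evidence reasoning (ER) into one overall category — can be sketched with a simple Dempster-Shafer-style combination of belief distributions. This is an illustrative sketch only, not the authors' ER implementation; the category set, feature order, and belief values below are hypothetical.

```python
# Illustrative sketch only: combines per-feature belief distributions over
# BI-RADS categories with Dempster's rule of combination. NOT the paper's
# actual ER method; categories and values are hypothetical.

CATEGORIES = ["BI-RADS 3", "BI-RADS 4", "BI-RADS 5"]

def dempster_combine(m1, m2):
    """Combine two mass functions defined over the same singleton hypotheses."""
    # Conflict: total mass assigned to incompatible category pairs.
    conflict = sum(m1[a] * m2[b] for a in m1 for b in m2 if a != b)
    if conflict >= 1.0:
        raise ValueError("total conflict; evidence cannot be combined")
    norm = 1.0 - conflict
    return {c: (m1[c] * m2[c]) / norm for c in m1}

def fuse(feature_beliefs):
    """Fold Dempster's rule over a list of per-feature belief distributions."""
    fused = feature_beliefs[0]
    for m in feature_beliefs[1:]:
        fused = dempster_combine(fused, m)
    return fused

# Hypothetical per-feature outputs (margin, contour, echogenicity,
# calcification, vascularity), each a distribution over the categories.
beliefs = [
    {"BI-RADS 3": 0.2, "BI-RADS 4": 0.5, "BI-RADS 5": 0.3},
    {"BI-RADS 3": 0.1, "BI-RADS 4": 0.6, "BI-RADS 5": 0.3},
    {"BI-RADS 3": 0.3, "BI-RADS 4": 0.4, "BI-RADS 5": 0.3},
    {"BI-RADS 3": 0.3, "BI-RADS 4": 0.5, "BI-RADS 5": 0.2},
    {"BI-RADS 3": 0.1, "BI-RADS 4": 0.7, "BI-RADS 5": 0.2},
]

overall = fuse(beliefs)                      # fused distribution, sums to 1
assessment = max(overall, key=overall.get)   # overall BI-RADS category
```

With singleton hypotheses only, folding Dempster's rule reduces to a normalized product of the per-feature masses, so agreement across features sharpens the fused distribution toward one category.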
DOI: 10.1088/1361-6560/ad869f
Publisher: IOP Publishing (England)
Publication date: 2024-10-24
PMID: 39401527
CODEN: PHMBA7
ORCID: 0009-0005-1770-8894; 0009-0007-4202-574X
ISSN: 0031-9155
EISSN: 1361-6560
Source: MEDLINE; Institute of Physics Journals
Subjects: auxiliary diagnosis
BI-RADS assessment
Breast Neoplasms - diagnostic imaging
breast ultrasound
Deep Learning
Female
Humans
Image Interpretation, Computer-Assisted - methods
multi-stage reasoning
Ultrasonography - methods
Ultrasonography, Mammary - methods