Hierarchical Convolutional Neural Networks for Segmentation of Breast Tumors in MRI With Application to Radiogenomics
Breast tumor segmentation based on dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a challenging problem and an active area of research. Particular challenges, as in other segmentation problems, include class imbalance as well as confounding background in DCE-MR images. To address these issues, we propose a mask-guided hierarchical learning (MHL) framework for breast tumor segmentation via fully convolutional networks (FCN). Specifically, we first develop an FCN model to generate a 3D breast mask as the region of interest (ROI) for each image, to remove confounding information from input DCE-MR images. We then design a two-stage FCN model to perform coarse-to-fine segmentation of breast tumors. In particular, we propose a Dice-Sensitivity-like loss function and a reinforcement sampling strategy to handle the class-imbalance problem. To precisely identify the locations of tumors that underwent a biopsy, we further propose an FCN model to detect two landmarks located at the nipples. We finally select the biopsied tumor based on both the identified landmarks and the segmentations. We validate our MHL method on 272 patients, achieving a mean Dice similarity coefficient (DSC) of 0.72, which is comparable to the mutual DSC between expert radiologists. Using the segmented biopsied tumors, we also demonstrate that the automatically generated masks can be applied to radiogenomics and can distinguish the luminal A subtype from other molecular subtypes with accuracy similar to that of an analysis based on semi-manual tumor segmentation.
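The DSC reported above is the standard overlap measure DSC = 2|A ∩ B| / (|A| + |B|) between a predicted mask A and a reference mask B. This record does not reproduce the authors' exact Dice-Sensitivity-like loss, so the sketch below only illustrates the general idea under stated assumptions: the function names, the mixing weight `alpha`, and the smoothing term `eps` are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only: the paper's exact "Dice-Sensitivity-like" loss is not
# given in this record; `alpha` and `eps` below are assumed, not the authors' values.
import numpy as np

def dice_coefficient(pred, target, eps=1e-6):
    """Standard Dice similarity coefficient: DSC = 2|A ∩ B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def dice_sensitivity_like_loss(prob, target, alpha=0.5, eps=1e-6):
    """Hypothetical soft loss mixing a Dice term with a sensitivity (recall) term.

    `prob` holds voxel-wise foreground probabilities in [0, 1]; `target` is the
    binary ground-truth mask. The sensitivity term penalizes missed tumor voxels,
    which is one way to counter the class imbalance the abstract mentions.
    """
    prob = prob.ravel().astype(np.float64)
    target = target.ravel().astype(np.float64)
    intersection = (prob * target).sum()
    dice = (2.0 * intersection + eps) / (prob.sum() + target.sum() + eps)
    sensitivity = (intersection + eps) / (target.sum() + eps)
    return 1.0 - (alpha * dice + (1.0 - alpha) * sensitivity)
```

A recall-weighted term of this kind is a common remedy when tumor voxels are vastly outnumbered by background voxels; the paper combines such a loss with a reinforcement sampling strategy, whose details are not reproduced here.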
Saved in:
Published in: | IEEE transactions on medical imaging 2019-02, Vol.38 (2), p.435-447 |
---|---|
Main authors: | Zhang, Jun; Saha, Ashirbani; Zhu, Zhe; Mazurowski, Maciej A. |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
container_end_page | 447 |
---|---|
container_issue | 2 |
container_start_page | 435 |
container_title | IEEE transactions on medical imaging |
container_volume | 38 |
creator | Zhang, Jun; Saha, Ashirbani; Zhu, Zhe; Mazurowski, Maciej A. |
description | Breast tumor segmentation based on dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a challenging problem and an active area of research. Particular challenges, as in other segmentation problems, include class imbalance as well as confounding background in DCE-MR images. To address these issues, we propose a mask-guided hierarchical learning (MHL) framework for breast tumor segmentation via fully convolutional networks (FCN). Specifically, we first develop an FCN model to generate a 3D breast mask as the region of interest (ROI) for each image, to remove confounding information from input DCE-MR images. We then design a two-stage FCN model to perform coarse-to-fine segmentation of breast tumors. In particular, we propose a Dice-Sensitivity-like loss function and a reinforcement sampling strategy to handle the class-imbalance problem. To precisely identify the locations of tumors that underwent a biopsy, we further propose an FCN model to detect two landmarks located at the nipples. We finally select the biopsied tumor based on both the identified landmarks and the segmentations. We validate our MHL method on 272 patients, achieving a mean Dice similarity coefficient (DSC) of 0.72, which is comparable to the mutual DSC between expert radiologists. Using the segmented biopsied tumors, we also demonstrate that the automatically generated masks can be applied to radiogenomics and can distinguish the luminal A subtype from other molecular subtypes with accuracy similar to that of an analysis based on semi-manual tumor segmentation. |
doi_str_mv | 10.1109/TMI.2018.2865671 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 0278-0062; EISSN: 1558-254X; PMID: 30130181 |
ispartof | IEEE transactions on medical imaging, 2019-02, Vol.38 (2), p.435-447 |
issn | 0278-0062 1558-254X |
language | eng |
recordid | cdi_crossref_primary_10_1109_TMI_2018_2865671 |
source | IEEE Electronic Library (IEL) |
subjects | Artificial neural networks; Biopsy; Breast; Breast cancer; Breast Neoplasms - classification; Breast Neoplasms - diagnostic imaging; Breast Neoplasms - genetics; breast tumor; Breast tumors; Dynamic contrast-enhanced magnetic resonance imaging; Feature extraction; Female; Genomics; Humans; Image Interpretation, Computer-Assisted - methods; Image processing; Image segmentation; Lesions; Magnetic resonance imaging; Magnetic Resonance Imaging - methods; Masks; Medical imaging; molecular subtype classification; Neural networks; Neural Networks, Computer; Nipples; NMR; Nuclear magnetic resonance; segmentation; Three dimensional models; Tumors |
title | Hierarchical Convolutional Neural Networks for Segmentation of Breast Tumors in MRI With Application to Radiogenomics |