Breast Cancer Diagnosis in Two-View Mammography Using End-to-End Trained EfficientNet-Based Convolutional Network

Some recent studies have described deep convolutional neural networks that diagnose breast cancer in mammograms with performance similar or even superior to that of human experts. One of the best techniques applies transfer learning twice: the first step uses a model trained on natural images to create a "patch classifier" that categorizes small subimages; the second uses the patch classifier to scan the whole mammogram and create a "single-view whole-image classifier". We propose a third transfer-learning step to obtain a "two-view classifier" that uses the two mammographic views: bilateral craniocaudal and mediolateral oblique. We use EfficientNet as the basis of our model and train the entire system end-to-end on the CBIS-DDSM dataset. To ensure statistical robustness, we test our system twice, using (a) 5-fold cross-validation and (b) the original training/test division of the dataset. Our technique reached an AUC of 0.9344 using 5-fold cross-validation (accuracy, sensitivity, and specificity are 85.13% at the equal-error-rate point of the ROC). Using the original dataset division, our technique achieved an AUC of 0.8483, to our knowledge the highest reported AUC for this problem, although subtle differences in the testing conditions of each work do not allow for an accurate comparison. The inference code and model are available at https://github.com/dpetrini/two-views-classifier
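The abstract's single figure of 85.13% for accuracy, sensitivity, and specificity follows from the choice of operating point: at the equal-error-rate (EER) point of the ROC curve, sensitivity equals specificity, and accuracy = (sens · P + spec · N) / (P + N) then equals both regardless of class balance. A minimal sketch of locating that point from scores, assuming a hypothetical helper name `eer_point` (this is illustrative and not code from the paper's repository):

```python
def eer_point(labels, scores):
    """Locate the ROC operating point where sensitivity is closest to
    specificity (the equal-error-rate point).

    At that point accuracy equals both rates, independent of class
    balance, since accuracy = (sens * P + spec * N) / (P + N).
    Illustrative helper; not taken from the paper's repository.
    """
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    best = None
    for t in sorted(set(scores)):          # candidate decision thresholds
        sens = sum(s >= t for s in pos) / len(pos)  # true positive rate
        spec = sum(s < t for s in neg) / len(neg)   # true negative rate
        if best is None or abs(sens - spec) < abs(best[1] - best[2]):
            best = (t, sens, spec)
    return best  # (threshold, sensitivity, specificity)

# Toy example: at threshold 0.6 both rates are 0.75, so accuracy is too.
print(eer_point([0, 0, 0, 0, 1, 1, 1, 1],
                [0.1, 0.2, 0.3, 0.6, 0.4, 0.7, 0.8, 0.9]))
# → (0.6, 0.75, 0.75)
```

In practice one would compute the full ROC (e.g. with scikit-learn's `roc_curve`) and interpolate; the brute-force scan above just makes the EER definition concrete.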

Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, p. 77723-77731
Main authors: Petrini, Daniel G. P.; Shimizu, Carlos; Roela, Rosimeire A.; Valente, Gabriel Vansuita; Folgueira, Maria Aparecida Azevedo Koike; Kim, Hae Yong
Format: Article
Language: English
Online access: Full text
DOI: 10.1109/ACCESS.2022.3193250
ISSN: 2169-3536
Subjects: Artificial intelligence; Artificial neural networks; Breast cancer; Breast cancer diagnosis; Classifiers; Convolutional neural network; Convolutional neural networks; Datasets; Deep learning; Lesions; Mammogram; Mammography; Medical imaging; Training; Transfer learning