Fine-tuning pre-trained neural networks for medical image classification in small clinical datasets
Saved in:
Published in: | Multimedia tools and applications 2024-03, Vol.83 (9), p.27305-27329 |
---|---|
Main authors: | Spolaôr, Newton; Lee, Huei Diana; Mendes, Ana Isabel; Nogueira, Conceição; Parmezan, Antonio Rafael Sabino; Takaki, Weber Shoity Resende; Coy, Claudio Saddy Rodrigues; Wu, Feng Chung; Fonseca-Pinto, Rui |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Full text |
container_end_page | 27329 |
---|---|
container_issue | 9 |
container_start_page | 27305 |
container_title | Multimedia tools and applications |
container_volume | 83 |
creator | Spolaôr, Newton; Lee, Huei Diana; Mendes, Ana Isabel; Nogueira, Conceição; Parmezan, Antonio Rafael Sabino; Takaki, Weber Shoity Resende; Coy, Claudio Saddy Rodrigues; Wu, Feng Chung; Fonseca-Pinto, Rui |
description | Funding We would like to acknowledge eurekaSD: Enhancing University Research and Education in Areas Useful for Sustainable Development - grants EK14AC0037 and EK15AC0264. We thank Araucária Foundation for the Support of the Scientific and Technological Development of Paraná through a Research and Technological Productivity Scholarship for H. D. Lee (grant 028/2019). We also thank the Brazilian National Council for Scientific and Technological Development (CNPq) through the grant number 142050/2019-9 for A. R. S. Parmezan. The Portuguese team was partially supported by Fundação para a Ciência e a Tecnologia (FCT). R. Fonseca-Pinto was financed by the projects UIDB/50008/2020, UIDP/50008/2020, UIDB/05704/2020 and UIDP/05704/2020 and C. V. Nogueira was financed by the projects UIDB/00013/2020 and UIDP/00013/2020. The funding agencies did not have any further involvement in this paper.
Convolutional neural networks have been effective in several applications, arising as a promising supporting tool in a relevant Dermatology problem: skin cancer diagnosis. However, generalizing well can be difficult when little training data is available. The fine-tuning transfer learning strategy has been employed to properly differentiate malignant from non-malignant lesions in dermoscopic images. Fine-tuning a pre-trained network allows one to classify data in the target domain, occasionally with few images, using knowledge acquired in another domain. This work proposes eight fine-tuning settings based on convolutional networks previously trained on ImageNet that can be employed mainly on limited data samples to reduce the risk of overfitting. They differ in the architecture, the learning rate and the number of unfrozen layer blocks. We evaluated the settings on two public datasets with 104 and 200 dermoscopic images. By finding competitive configurations in small datasets, this paper illustrates that deep learning can be effective if one has only a few dozen malignant and non-malignant lesion images to study and differentiate in Dermatology. The proposal is also flexible and potentially useful for other domains. In fact, it performed satisfactorily in an assessment conducted on a larger dataset with 746 computed tomography images associated with the coronavirus disease. |
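The fine-tuning strategy summarized in the abstract (keeping most of an ImageNet pre-trained convolutional base frozen, unfreezing a few of the last layer blocks, and retraining them together with a new classifier head at a low learning rate) can be sketched in a few lines of code. The following is a minimal sketch, assuming a TensorFlow/Keras environment with a VGG16 backbone and the RMSprop optimizer (both appear among the record's subject terms); the input size, classifier head, learning rate and number of unfrozen blocks are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal fine-tuning sketch (illustrative assumptions, not the authors' exact setup):
# VGG16 backbone pre-trained on ImageNet, RMSprop with a small learning rate,
# and a configurable number of unfrozen convolutional blocks.
from tensorflow.keras import layers, models, optimizers
from tensorflow.keras.applications import VGG16


def build_finetune_model(n_unfrozen_blocks=1, learning_rate=1e-5):
    # Convolutional base pre-trained on ImageNet, without the original classifier.
    base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))

    # Freeze every layer, then unfreeze only the last n convolutional blocks
    # (VGG16 layer names start with "block1" ... "block5").
    base.trainable = True
    for layer in base.layers:
        layer.trainable = False
    unfrozen = ["block5", "block4", "block3", "block2", "block1"][:n_unfrozen_blocks]
    for layer in base.layers:
        if any(layer.name.startswith(block) for block in unfrozen):
            layer.trainable = True

    # New head for the binary task (malignant vs. non-malignant).
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),
    ])

    # A small learning rate keeps the pre-trained features from being overwritten.
    model.compile(optimizer=optimizers.RMSprop(learning_rate=learning_rate),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model


# Usage (hypothetical tf.data datasets of dermoscopic images):
# model = build_finetune_model(n_unfrozen_blocks=1, learning_rate=1e-5)
# model.fit(train_ds, validation_data=val_ds, epochs=20)
```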
doi_str_mv | 10.1007/s11042-023-16529-w |
format | Article |
publisher | New York: Springer |
rights | The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law. |
orcid | 0000-0001-6774-5363; 0000-0002-9269-2221; 0000-0003-0748-3693 |
fulltext | fulltext |
identifier | ISSN: 1573-7721 |
ispartof | Multimedia tools and applications, 2024-03, Vol.83 (9), p.27305-27329 |
issn | 1573-7721; 1380-7501 |
language | eng |
recordid | cdi_proquest_journals_2933269661 |
source | SpringerLink Journals - AutoHoldings |
subjects | Artificial neural networks; Computer Communication Networks; Computer Science; Data Structures and Information Theory; Datasets; Deep learning; Dermatology; Feature learning; Image acquisition; Image classification; Knowledge acquisition; Lesions; Machine learning; Medical imaging; Multimedia Information Systems; Neural networks; RMSprop; Shallow learning; Skin cancer; Special Purpose and Application-Based Systems; Statistical test; Track 2: Medical Applications of Multimedia; VGG |
title | Fine-tuning pre-trained neural networks for medical image classification in small clinical datasets |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-05T20%3A38%3A15IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Fine-tuning%20pre-trained%20neural%20networks%20for%20medical%20image%20classification%20in%20small%20clinical%20datasets&rft.jtitle=Multimedia%20tools%20and%20applications&rft.au=Spola%C3%B4r,%20Newton&rft.date=2024-03-01&rft.volume=83&rft.issue=9&rft.spage=27305&rft.epage=27329&rft.pages=27305-27329&rft.issn=1573-7721&rft.eissn=1573-7721&rft_id=info:doi/10.1007/s11042-023-16529-w&rft_dat=%3Cproquest_cross%3E2933269661%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2933269661&rft_id=info:pmid/&rfr_iscdi=true |