S²C-DeLeNet: A parameter transfer based segmentation-classification integration for detecting skin cancer lesions from dermoscopic images

Dermoscopic images ideally depict pigmentation attributes on the skin surface, which is highly regarded in the medical community for detection of skin abnormality, disease or even cancer. The identification of such abnormality, however, requires trained eyes, and accurate detection necessitates the process being time-intensive.

Detailed description

Bibliographic details
Published in: Computers in biology and medicine 2022-11, Vol.150, p.106148, Article 106148
Main authors: Alam, Md Jahin, Mohammad, Mir Sayeed, Hossain, Md Adnan Faisal, Showmik, Ishtiaque Ahmed, Raihan, Munshi Sanowar, Ahmed, Shahed, Mahmud, Talha Ibn
Format: Article
Language: eng
Keywords:
Online access: Full text
description Dermoscopic images ideally depict pigmentation attributes on the skin surface, which is highly regarded in the medical community for detection of skin abnormality, disease or even cancer. The identification of such abnormality, however, requires trained eyes, and accurate detection necessitates the process being time-intensive. Computerized detection schemes have therefore become essential, especially schemes that adopt deep learning. In this paper, a convolutional deep neural network, S²C-DeLeNet, is proposed, which (i) segments lesion regions with respect to the unaffected skin tissue in dermoscopic images using a segmentation sub-network, and (ii) classifies each image by its medical condition type utilizing parameters transferred from the inherent segmentation sub-network. The segmentation sub-network contains an EfficientNet-B4 backbone in place of the encoder, and the classification sub-network bears a 'Classification Feature Extraction' system that pulls trained segmentation feature maps towards lesion prediction. Inside the classification architecture, two components have been designed: (i) a 'Feature Coalescing Module' to trail and mix features of each dimension from both encoder and decoder, and (ii) a '3D-Layer Residuals' block to create a parallel pathway of low-dimensional features with high variance for better classification. After fine-tuning on a publicly accessible dataset, a mean dice score of 0.9494 is obtained for segmentation, which beats existing segmentation strategies, and a mean accuracy of 0.9103 is obtained for classification, which outperforms conventional and noted classifiers. Additionally, the fine-tuned network demonstrates highly satisfactory results on other skin cancer segmentation datasets under cross-inference.
Extensive experimentation is done to prove the efficacy of the network not only for dermoscopic images but also for different medical modalities, which shows its potential as a systematic diagnostic solution in the field of dermatology and possibly beyond.
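The parameter-transfer idea the abstract describes (train the segmentation encoder first, then initialise the classification sub-network from those trained weights while only the classifier head starts fresh) can be sketched as follows. This is an illustrative toy, not the paper's code: all layer names (`encoder/conv1`, `head/fc`) and sizes are invented for the example.

```python
import random

random.seed(0)

def init_params(shapes):
    """Initialise a dict mapping layer name -> flat weight list (illustrative)."""
    return {name: [random.gauss(0, 1) for _ in range(size)]
            for name, size in shapes.items()}

# Hypothetical segmentation network: encoder backbone + decoder head.
seg_params = init_params({
    "encoder/conv1": 8,
    "encoder/conv2": 16,
    "decoder/up1": 8,
})
# ... assume seg_params have now been trained on the segmentation task ...

# Parameter transfer: the classification sub-network is initialised
# from the trained encoder weights; only the classifier head is new.
cls_params = {name: list(w) for name, w in seg_params.items()
              if name.startswith("encoder/")}
cls_params["head/fc"] = [0.0] * 7  # e.g. 7 lesion classes, freshly initialised

shared = sorted(n for n in cls_params if n in seg_params)
print(shared)                   # ['encoder/conv1', 'encoder/conv2']
print("head/fc" in seg_params)  # False: the head is not transferred
```

In a real framework the same pattern amounts to a filtered state-dict copy from the segmentation model into the classifier, followed by attaching an untrained classification head.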
doi_str_mv 10.1016/j.compbiomed.2022.106148
format Article
identifier ISSN: 0010-4825
EISSN: 1879-0534
source MEDLINE; Elsevier ScienceDirect Journals
subjects Artificial neural networks
Cancer
Classification
Coders
Datasets
Deep learning
Dermatology
Dermoscopy - methods
Experimentation
Feature extraction
Feature maps
Humans
Image classification
Image processing
Image Processing, Computer-Assisted - methods
Image segmentation
Lesions
Machine learning
Medical imaging
Neural networks
Neural Networks, Computer
Parameters
Pigmentation
Skin
Skin - diagnostic imaging
Skin cancer
Skin diseases
Skin Neoplasms - diagnostic imaging
Tactics
title S²C-DeLeNet: A parameter transfer based segmentation-classification integration for detecting skin cancer lesions from dermoscopic images