DSNet: Automatic dermoscopic skin lesion segmentation


Full description

Bibliographic Details
Published in: Computers in biology and medicine, 2020-05, Vol. 120, Article 103738
Main authors: Hasan, Md. Kamrul; Dahal, Lavsen; Samarakoon, Prasad N.; Tushar, Fakrul Islam; Martí, Robert
Format: Article
Language: English
Online access: Full text
Description: Automatic segmentation of skin lesions is considered a crucial step in Computer-aided Diagnosis (CAD) systems for melanoma detection. Despite its significance, skin lesion segmentation remains an unsolved challenge because of the lesions' variability in color, texture, and shape, and their indistinguishable boundaries. In this study, we present a new automatic semantic segmentation network for robust skin lesion segmentation, named Dermoscopic Skin Network (DSNet). To reduce the number of parameters and make the network lightweight, we used depth-wise separable convolutions in lieu of standard convolutions to project the learned discriminating features onto the pixel space at different stages of the encoder. Additionally, we implemented both a U-Net and a Fully Convolutional Network (FCN8s) to compare against the proposed DSNet. We evaluated the proposed model on two publicly available datasets, namely ISIC-2017 (https://challenge.kitware.com/#challenge/583f126bcad3a51cc66c8d9a) and PH2 (https://www.fc.up.pt/addi/ph2%20database.html). The obtained mean Intersection over Union (mIoU) is 77.5% and 87.0% for the ISIC-2017 and PH2 datasets respectively, outperforming the ISIC-2017 challenge winner by 1.0% with respect to mIoU. The proposed network also outperformed U-Net and FCN8s by 3.6% and 6.8% respectively with respect to mIoU on the ISIC-2017 dataset. Our network outperforms the other methods discussed in the article and provides better-segmented masks on two different test datasets, which can lead to better performance in melanoma detection. The trained model, along with the source code and predicted masks, is publicly available at https://github.com/kamruleee51/Skin-Lesion-Segmentation-Using-Proposed-DSNet.
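The parameter savings from swapping standard convolutions for depth-wise separable ones can be sketched with a quick count. This is a minimal illustration of the arithmetic only; the kernel size and channel counts below are hypothetical and not taken from the DSNet architecture.

```python
# Parameter-count comparison: standard vs. depth-wise separable convolution.
# Layer sizes here are hypothetical, chosen only to illustrate the reduction.

def standard_conv_params(k, c_in, c_out):
    # A standard k x k convolution learns one k*k*c_in kernel per output channel.
    return k * k * c_in * c_out

def separable_conv_params(k, c_in, c_out):
    # Depth-wise step: one k x k kernel per input channel,
    # followed by a 1x1 point-wise convolution that mixes channels.
    return k * k * c_in + c_in * c_out

k, c_in, c_out = 3, 64, 128
std = standard_conv_params(k, c_in, c_out)   # 73728
sep = separable_conv_params(k, c_in, c_out)  # 8768
print(std, sep, round(std / sep, 1))         # roughly an 8x reduction here
```

The ratio grows with the number of output channels, which is why the substitution makes the network noticeably lighter at the deeper encoder stages.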
Highlights:
- DSNet eliminates the necessity of learning redundant features and the vanishing-gradient problem.
- A considerably lightweight structure (fewer parameters, shorter training and testing times).
- The proposed hybrid loss function maximizes the overlap between the true and predicted masks.
- Transfer learning and image augmentation are used to build a generic DSNet even though a small dataset is used.
- The proposed DSNet can precisely segment the lesion and is robust to hair fibers and other artifacts.
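The mIoU metric the paper reports (77.5% on ISIC-2017, 87.0% on PH2) is a straightforward per-mask Intersection over Union averaged over the test set. A minimal sketch on toy 0/1 masks stored as nested lists — not the paper's evaluation code:

```python
# Intersection over Union for binary segmentation masks, and its mean
# over a set of predictions. Toy masks only; real evaluation runs on
# full-resolution predicted masks per test image.

def iou(pred, true):
    inter = sum(p & t for row_p, row_t in zip(pred, true)
                      for p, t in zip(row_p, row_t))
    union = sum(p | t for row_p, row_t in zip(pred, true)
                      for p, t in zip(row_p, row_t))
    return inter / union if union else 1.0  # both masks empty: perfect match

def mean_iou(preds, trues):
    return sum(iou(p, t) for p, t in zip(preds, trues)) / len(preds)

pred = [[1, 1, 0],
        [0, 1, 0]]
true = [[1, 0, 0],
        [0, 1, 1]]
print(iou(pred, true))  # 2 overlapping pixels / 4 in the union = 0.5
```

A Dice-style term over the same intersection and union counts is one common way to build the kind of overlap-maximizing hybrid loss the highlights describe, though the paper's exact formulation should be taken from the article itself.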
DOI: 10.1016/j.compbiomed.2020.103738
ISSN: 0010-4825
EISSN: 1879-0534
PMID: 32421644
Source: Elsevier ScienceDirect Journals Complete
Subjects: Coders; Computer-aided Diagnosis (CAD); Convolution; Datasets; Deep learning; Lesions; Masks; Medical diagnosis; Melanoma; Melanoma detection; Semantic segmentation; Skin diseases; Skin lesion segmentation; Skin lesions; Source code