Dermoscopic image segmentation based on Pyramid Residual Attention Module
We propose a stacked convolutional neural network incorporating a novel and efficient pyramid residual attention (PRA) module for the automatic segmentation of dermoscopic images. Precise segmentation is a significant and challenging step for computer-aided diagnosis of skin lesions. The proposed PRA has the following characteristics. First, it combines three widely used components: a pyramid structure that extracts feature information from the lesion area at multiple scales, residual connections that keep model training efficient, and an attention mechanism that screens for effective feature maps. Thanks to the PRA, the network recovers precise boundary information separating healthy skin from diseased areas even when the lesion boundary is blurred. Second, stacking PRA modules efficiently increases the segmentation ability of a single module for lesion regions. Third, we incorporate the encoder-decoder idea into the overall architecture: unlike traditional networks, we divide the segmentation procedure into three levels and construct the pyramid residual attention network (PRAN). The shallow layer mainly processes spatial information, the middle layer refines both spatial and semantic information, and the deep layer intensively learns semantic information. The PRA module is the basic building block of PRAN and suffices to keep the three-level architecture efficient. We extensively evaluate our method on the ISIC2017 and ISIC2018 datasets. The experimental results demonstrate that PRAN achieves segmentation performance comparable to or better than state-of-the-art deep learning models under the same experimental conditions.
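The record gives only this high-level description of the PRA module; the paper's exact layer configuration is not reproduced here. As a rough illustration, a minimal PyTorch sketch of a block combining the three ingredients the abstract names (multi-scale pyramid pooling, a residual shortcut, and channel attention over feature maps) could look like the following. Every concrete choice (the `PRABlockSketch` name, the pooling bin sizes, the squeeze-and-excitation-style attention, the reduction ratio) is an assumption made for illustration, not the authors' implementation.

```python
# Hypothetical sketch only: all layer choices below are illustrative
# assumptions, not the PRAN paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PRABlockSketch(nn.Module):
    """Toy block with the three ingredients the abstract names:
    pyramid (multi-scale) features, a residual shortcut, and an
    attention gate that reweights feature maps."""

    def __init__(self, channels: int, bins=(1, 2, 4), reduction: int = 4):
        super().__init__()
        self.bins = bins
        # Pyramid branch: pool the input to a few fixed grid sizes
        # (PSPNet-style bins), project each with a 1x1 conv, then
        # upsample back, so the block sees the lesion at several scales.
        self.pyramid_convs = nn.ModuleList(
            nn.Conv2d(channels, channels // len(bins), kernel_size=1)
            for _ in bins
        )
        self.fuse = nn.Conv2d(
            channels + (channels // len(bins)) * len(bins),
            channels, kernel_size=3, padding=1,
        )
        # Channel attention (squeeze-and-excitation style), one way to
        # "screen effective feature maps" as the abstract puts it.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, w = x.shape[-2:]
        feats = [x]
        for bin_size, conv in zip(self.bins, self.pyramid_convs):
            p = conv(F.adaptive_avg_pool2d(x, output_size=bin_size))
            feats.append(
                F.interpolate(p, size=(h, w), mode="bilinear",
                              align_corners=False)
            )
        fused = self.fuse(torch.cat(feats, dim=1))
        # Residual shortcut plus attention-gated features, so stacked
        # blocks stay easy to train, as the abstract emphasizes.
        return x + fused * self.attn(fused)


if __name__ == "__main__":
    block = PRABlockSketch(channels=32)
    out = block(torch.randn(1, 32, 96, 96))
    print(out.shape)  # torch.Size([1, 32, 96, 96])
```

Stacking several such blocks, as the abstract suggests, is then plain sequential composition; the residual shortcut is what keeps a deep stack trainable.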
Published in: | PloS one 2022-09, Vol.17 (9), p.e0267380 |
---|---|
Main authors: | Jiang, Yun; Cheng, Tongtong; Dong, Jinkun; Liang, Jing; Zhang, Yuan; Lin, Xin; Yao, Huixia |
Format: | Article |
Language: | eng |
Online access: | Full text |
doi | 10.1371/journal.pone.0267380 |
pmid | 36112649 |
publisher | Public Library of Science |
published | 2022-09-16 |
issn | 1932-6203 |
source | MEDLINE; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; PubMed Central; Free Full-Text Journals in Chemistry; Public Library of Science (PLoS) |
subjects | Analysis; Artificial neural networks; Biology and Life Sciences; Care and treatment; Coders; Computer and Information Sciences; Datasets; Deep learning; Diagnosis; Diagnosis, Computer-Assisted; Disease Progression; Encoders-Decoders; Engineering and Technology; Evaluation; Feature extraction; Humans; Image processing; Image segmentation; Information processing; Lesions; Medical imaging equipment; Medicine and Health Sciences; Melanoma; Modules; Neural networks; Neural Networks, Computer; People and Places; Pyramidal Tracts; Research and Analysis Methods; Semantics; Skin cancer; Skin Diseases; Skin lesions; Social Sciences; Spatial data |