Burns Depth Assessment Using Deep Learning Features
Published in: | Journal of medical and biological engineering, 2020-12, Vol. 40 (6), p. 923-933 |
---|---|
Main authors: | Abubakar, Aliyu; Ugail, Hassan; Smith, Kirsty M.; Bukar, Ali Maina; Elmahmudi, Ali |
Format: | Article |
Language: | English |
Subjects: | Accuracy; Biomedical Engineering and Bioengineering; Burns; Cell Biology; Color imagery; Deep learning; Diagnostic systems; Doppler effect; Engineering; Evaluation; Imaging; Medical imaging; Original Article; Predictions; Radiology; Reliability analysis; Skin grafts; Support vector machines; Transfer learning |
Online Access: | Full text |
DOI: | 10.1007/s40846-020-00574-z |
ISSN: | 1609-0985 |
EISSN: | 2199-4757 |
Publisher: | Springer Berlin Heidelberg |
Abstract

Purpose
Burn depth evaluation is a lifesaving yet challenging task that requires objective techniques. Visual assessment is the method most commonly used by surgeons, but its reliability ranges between 60 and 80% and it is subjective, lacking any standard guideline. Currently, the only standard adjunct to clinical evaluation of burn depth is Laser Doppler Imaging (LDI), which measures microcirculation within the dermal tissue and predicts the potential healing time, which corresponds to the depth of the injury, with up to 100% accuracy. However, the use of LDI is limited by several factors: high equipment and diagnostic costs, sensitivity to patient movement (which makes paediatric patients difficult to assess), the high level of human expertise required to operate the device, and the fact that 100% accuracy is only achievable 72 h after the injury. These shortfalls motivate the need for an objective and affordable technique.
Method
In this study, we leverage deep transfer learning, using two pretrained models, ResNet50 and VGG16, to extract image features (ResFeat50 and VggFeat16) from a burn dataset of 2080 RGB images composed of healthy skin, first-degree, second-degree and third-degree burns in evenly distributed classes. We then use One-versus-One Support Vector Machines (SVM) for multi-class prediction, trained with 10-fold cross-validation to achieve an optimum trade-off between bias and variance.
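As a concrete illustration of this feature-extraction-plus-SVM approach, the sketch below pairs a frozen, ImageNet-pretrained ResNet50 with a One-versus-One SVM evaluated by 10-fold cross-validation. It is a minimal sketch under stated assumptions: the dataset directory, image size, batch size and SVM kernel are hypothetical choices for illustration, not details taken from the paper.

```python
# Hedged sketch: extract ResNet50 ("ResFeat50") features and classify them with a
# One-vs-One SVM under 10-fold cross-validation. Paths and hyperparameters are
# illustrative assumptions, not the authors' implementation.
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.resnet50 import ResNet50, preprocess_input
from sklearn.svm import SVC
from sklearn.multiclass import OneVsOneClassifier
from sklearn.model_selection import cross_val_score

IMG_SIZE = (224, 224)        # ResNet50's default input size
DATA_DIR = "burn_dataset/"   # hypothetical folder with one subfolder per class

# Frozen ImageNet backbone; global average pooling yields a 2048-d feature vector.
backbone = ResNet50(weights="imagenet", include_top=False, pooling="avg")

# Load images; class labels are inferred from the subfolder names.
ds = tf.keras.utils.image_dataset_from_directory(
    DATA_DIR, image_size=IMG_SIZE, batch_size=32, shuffle=False)

features, labels = [], []
for images, y in ds:
    feats = backbone.predict(preprocess_input(images), verbose=0)
    features.append(feats)
    labels.append(y.numpy())
X = np.concatenate(features)   # shape: (n_images, 2048) -> "ResFeat50"
y = np.concatenate(labels)

# One-vs-One SVM evaluated with 10-fold cross-validation.
clf = OneVsOneClassifier(SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.4f} +/- {scores.std():.4f}")
```

The same pattern would yield the VggFeat16 variant by swapping in tensorflow.keras.applications.vgg16.VGG16 (a 512-d pooled feature vector) together with its own preprocess_input.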
Results
The proposed approach yields a maximum prediction accuracy of 95.43% using ResFeat50 and 85.67% using VggFeat16. The average recall, precision and F1-score are 95.50%, 95.50% and 95.50% for ResFeat50, and 85.75%, 86.25% and 85.75% for VggFeat16, respectively.
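For reference, averaged precision, recall and F1 scores like those reported above can be obtained from out-of-fold predictions. The snippet below continues the sketch from the Method section and is a generic scikit-learn illustration, not the authors' evaluation code.

```python
# Hedged sketch (continues the previous example): accuracy and averaged
# precision/recall/F1 from 10-fold out-of-fold predictions of the OvO SVM.
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import accuracy_score, classification_report

y_pred = cross_val_predict(clf, X, y, cv=10)        # one prediction per image
print("accuracy:", accuracy_score(y, y_pred))
print(classification_report(y, y_pred, digits=4))   # per-class and averaged scores
```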
Conclusion
The proposed pipeline achieves state-of-the-art prediction accuracy and, notably, indicates that a decision on whether the injury requires surgical intervention such as skin grafting can be made in less than a minute.