Deep learning neural network for texture feature extraction in oral cancer: enhanced loss function

The use of a binary classifier like the sigmoid function and loss functions reduces the accuracy of deep learning algorithms. This research aims to increase the accuracy of detecting and classifying oral tumours within a reduced processing time.

Detailed description

Bibliographic details
Published in: Multimedia Tools and Applications, 2020-10, Vol. 79 (37-38), p. 27867-27890
Main authors: Bhandari, Bishal; Alsadoon, Abeer; Prasad, P. W. C.; Abdullah, Salma; Haddad, Sami
Format: Article
Language: English
Subjects:
Online access: Full text
Description: The use of a binary classifier such as the sigmoid function and its associated loss functions reduces the accuracy of deep learning algorithms. This research aims to increase the accuracy of detecting and classifying oral tumours within a reduced processing time. The proposed system consists of a convolutional neural network with a modified loss function that minimises the error in predicting and classifying oral tumours by reducing overfitting of the data and supporting multi-class classification. The proposed solution was tested on data samples from multiple datasets with four kinds of oral tumours. The averages of the different accuracy values and processing times were calculated to derive the overall accuracy. Based on the obtained results, the proposed solution achieved an overall accuracy of 96.5%, almost 2.0% higher than the state-of-the-art solution's 94.5%. Similarly, the processing time was reduced by 30–40 milliseconds relative to the state-of-the-art solution. The proposed system focuses on detecting oral tumours in a given magnetic resonance imaging (MRI) scan and classifying whether the tumours are benign or malignant. This study solves the issue of overfitting during the training of neural networks and provides a method for multi-class classification.
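The abstract contrasts a binary sigmoid objective with a multi-class loss modified to curb overfitting. The paper's exact loss function is not reproduced in this record; as a minimal illustrative sketch (not the authors' method), a softmax cross-entropy over four tumour classes with an L2 weight penalty captures the general idea. The function names and the penalty strength `lam` here are assumptions for illustration only.

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def regularized_cross_entropy(logits, labels, weights, lam=1e-3):
    """Categorical cross-entropy plus an L2 penalty on the weights.

    Illustrative only: replaces a binary sigmoid objective with a
    multi-class loss whose extra penalty term discourages large weights,
    one common way to reduce overfitting.
    """
    probs = softmax(logits)
    n = logits.shape[0]
    # Negative log-likelihood of each sample's true class.
    nll = -np.log(probs[np.arange(n), labels] + 1e-12).mean()
    # L2 penalty summed over all weight arrays of the (hypothetical) network.
    l2 = lam * sum((w ** 2).sum() for w in weights)
    return nll + l2

# Four tumour classes, matching the four kinds evaluated in the paper.
logits = np.array([[4.0, 0.1, 0.1, 0.1],
                   [0.1, 3.0, 0.2, 0.1]])
labels = np.array([0, 1])
loss = regularized_cross_entropy(logits, labels,
                                 weights=[np.ones((2, 2))], lam=1e-3)
```

Because both examples are already classified confidently and the weights are small, the resulting loss is a small positive number; a network trained against this objective trades a slightly higher data-fit term for smaller weights.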
DOI: 10.1007/s11042-020-09384-6
ISSN: 1380-7501
EISSN: 1573-7721
Source: SpringerNature Journals; Web of Science - Science Citation Index Expanded - 2020
Subjects:
Accuracy
Algorithms
Artificial neural networks
Classification
Computer Communication Networks
Computer Science
Computer Science, Information Systems
Computer Science, Software Engineering
Computer Science, Theory & Methods
Data Structures and Information Theory
Deep learning
Engineering
Engineering, Electrical & Electronic
Feature extraction
Machine learning
Magnetic resonance imaging
Multimedia Information Systems
Neural networks
Oral cancer
Science & Technology
Special Purpose and Application-Based Systems
Technology
Tumors