Enhancing Colorectal Cancer Diagnosis With Feature Fusion and Convolutional Neural Networks

TumorDiagX is a cutting-edge framework that combines deep learning and computer vision to accurately identify and classify cancers. Our collection of 1,518 colonoscopy images is meticulously pre-processed, including greyscale conversion and local binary pattern (LBP) extraction, before being securely stored on the Google Cloud platform. In the second phase, we fully assess three different convolutional neural networks (CNNs): the residual network with 50 layers (ResNet-50), DenseNet-201, and the visual geometry group network with 16 layers (VGG-16). Stage three introduces four integrated CNNs (ResNet-50+DenseNet-201 (RD-22), DenseNet-201+VGG-16 (DV-22), ResNet-50+VGG-16 (RV-22), and ResNet-50+DenseNet-201+VGG-16 (RDV-22)) to improve cancer detection by combining the capabilities of several networks. Comprehensive analysis and training on the datasets provide significant insights into each CNN's performance. The fourth step involves an extensive comparison, integrating and comparing all three datasets using individual and integrated CNNs to determine the most effective models for cancer diagnosis. In the final step, image segmentation leverages an encoder–decoder network, namely a Universal Network (U-Net) CNN, to aid in the visual detection of malignant cancer lesions. The results highlight the effectiveness of TumorDiagX: the feature fusion CNN using DenseNet-201 attains training and testing accuracies of 97.27% and 97.35%, respectively. Notably, the feature fusion CNN based on RDV-22 performs even better, with training and testing accuracies of 98.47% and 97.93%, respectively, and a dice coefficient of 0.92. The information is privately maintained in the cloud and acts as an essential asset for healthcare practitioners, allowing for specific cancer prediction and prompt detection. Our method, with its meticulous performance metrics and multifaceted approach, has the potential to advance early cancer identification and treatment.
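The abstract outlines a multi-stage pipeline: greyscale plus LBP pre-processing, feature-fusion classification over ResNet-50, DenseNet-201, and VGG-16, and U-Net segmentation scored with the Dice coefficient. The Python sketch below is a rough, minimal illustration of how such a pipeline could be wired together, not the authors' released code: it uses scikit-image for the LBP step, concatenates global-average-pooled features from Keras ImageNet backbones for an RDV-22-style fusion, and computes a NumPy Dice score. The LBP parameters, the 256-unit head, and the two-class output are illustrative assumptions.

```python
# Illustrative sketch only; assumes TensorFlow/Keras and scikit-image, and a
# binary "lesion vs. normal" labelling. Parameter choices are placeholders.
import numpy as np
import tensorflow as tf
from skimage.color import rgb2gray
from skimage.feature import local_binary_pattern


def preprocess_frame(rgb_image: np.ndarray) -> np.ndarray:
    """Greyscale conversion followed by local binary pattern (LBP) extraction."""
    grey = (rgb2gray(rgb_image) * 255).astype(np.uint8)
    # P=8 neighbours on a radius-1 circle; 'uniform' is a common texture setting.
    return local_binary_pattern(grey, P=8, R=1, method="uniform")


def build_fusion_classifier(input_shape=(224, 224, 3), n_classes=2) -> tf.keras.Model:
    """Feature-fusion CNN combining ResNet-50, DenseNet-201, and VGG-16 (RDV-22 style).

    Note: in practice each backbone has its own preprocess_input; this sketch
    feeds the same tensor to all three for brevity.
    """
    inputs = tf.keras.Input(shape=input_shape)
    backbones = [
        tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                       input_shape=input_shape, pooling="avg"),
        tf.keras.applications.DenseNet201(include_top=False, weights="imagenet",
                                          input_shape=input_shape, pooling="avg"),
        tf.keras.applications.VGG16(include_top=False, weights="imagenet",
                                    input_shape=input_shape, pooling="avg"),
    ]
    # Concatenate the global-average-pooled feature vectors from each backbone.
    fused = tf.keras.layers.Concatenate()([b(inputs) for b in backbones])
    x = tf.keras.layers.Dense(256, activation="relu")(fused)
    outputs = tf.keras.layers.Dense(n_classes, activation="softmax")(x)
    return tf.keras.Model(inputs, outputs, name="rdv22_feature_fusion")


def dice_coefficient(y_true: np.ndarray, y_pred: np.ndarray, eps: float = 1e-7) -> float:
    """Dice overlap between a ground-truth mask and a predicted segmentation mask."""
    y_true, y_pred = y_true.astype(bool), y_pred.astype(bool)
    intersection = np.logical_and(y_true, y_pred).sum()
    return (2.0 * intersection + eps) / (y_true.sum() + y_pred.sum() + eps)
```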

Bibliographic Details
Published in: Journal of Sensors, 2024-11, Vol. 2024
Main Authors: Narasimha Raju, Akella S; Rajababu, M; Acharya, Ashish; Sajja Suneel
Format: Article
Language: English
Online Access: Full text
DOI: 10.1155/2024/9916843
ISSN: 1687-725X
EISSN: 1687-7268
Source: Wiley-Blackwell Open Access Titles; EZB-FREE-00999 freely available EZB journals; Alma/SFX Local Collection
Subjects:
Accuracy
Algorithms
Artificial intelligence
Artificial neural networks
Cancer
Cancer therapies
Colonoscopy
Colorectal cancer
Computer vision
Datasets
Deep learning
Diagnosis
Effectiveness
Endoscopy
Fatalities
Image enhancement
Image segmentation
Machine learning
Medical diagnosis
Medical imaging
Medical research
Mortality
Neural networks
Performance measurement
Polyps
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-11T05%3A47%3A05IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Enhancing%20Colorectal%20Cancer%20Diagnosis%20With%20Feature%20Fusion%20and%20Convolutional%20Neural%20Networks&rft.jtitle=Journal%20of%20sensors&rft.au=Narasimha%20Raju,%20Akella%20S&rft.date=2024-11-08&rft.volume=2024&rft.issn=1687-725X&rft.eissn=1687-7268&rft_id=info:doi/10.1155/2024/9916843&rft_dat=%3Cproquest%3E3129229938%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3129229938&rft_id=info:pmid/&rfr_iscdi=true