Convolutional Neural Network Based on Regional Features and Dimension Matching for Skin Cancer Classification
Early diagnosis is clinically important for curing skin cancer. However, because some skin cancers look visually similar and dermatologists rely on subjective experience to distinguish between types, diagnostic accuracy is often suboptimal. Recently, the introduction of...
Saved in:
Published in: | IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, 2024/08/01, Vol.E107.A(8), pp.1319-1327 |
---|---|
Main authors: | SHA, Zhichao; MA, Ziji; XIONG, Kunlai; QIN, Liangcheng; WANG, Xueying |
Format: | Article |
Language: | eng |
Subjects: | Artificial neural networks; Big Data; Cancer; convolutional neural network; Datasets; dimension and region feature matching; Feature extraction; Feature maps; ISIC Archive; Matching; medical image classification; Medical imaging; Modules; ResNet; Skin cancer |
Online access: | Full text |
container_end_page | 1327 |
---|---|
container_issue | 8 |
container_start_page | 1319 |
container_title | IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences |
container_volume | E107.A |
creator | SHA, Zhichao; MA, Ziji; XIONG, Kunlai; QIN, Liangcheng; WANG, Xueying |
description | Early diagnosis is clinically important for curing skin cancer. However, because some skin cancers look visually similar and dermatologists rely on subjective experience to distinguish between types, diagnostic accuracy is often suboptimal. Recently, computer-aided methods have helped physicians improve recognition rates, but challenges remain. For large-scale dermoscopic image data, the residual network (ResNet) is well suited to learning feature relationships within big data because of its greater network depth. To address ResNet's shortcomings, this paper proposes a multi-region feature extraction and dimension-raising matching method that further improves the utilization of medical image features. The method first extracts rich and diverse features from multiple regions of the feature map, avoiding the tendency of traditional residual modules to repeatedly extract features from a few fixed regions. The fused features are then strengthened by raising the dimensionality of the branch-path information and stacking it with the main path, which resolves the mismatch that otherwise degrades fusion when the two paths have different dimensionality. The proposed method is evaluated on the International Skin Imaging Collaboration (ISIC) Archive dataset, which contains more than 40,000 images. On this and other datasets, the results improve over networks built from traditional residual modules and over several popular networks. |
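To make the dimension-matching idea in the abstract above concrete, the following is a minimal sketch, not the authors' code: a residual-style block in which the shortcut (branch) path is raised to the main path's channel dimension by a 1×1 convolution before the two paths are combined. The class name, layer choices, and hyperparameters are assumptions for illustration, and the abstract's "stacking" of the two paths is approximated here by element-wise addition as in a standard residual connection; concatenation would be an equally plausible reading.

```python
# Hypothetical sketch of a dimension-matched residual block in PyTorch.
import torch
import torch.nn as nn


class DimensionMatchedResidualBlock(nn.Module):
    def __init__(self, in_channels: int, out_channels: int, stride: int = 1):
        super().__init__()
        # Main path: two 3x3 convolutions, as in a standard ResNet basic block.
        self.main = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=3,
                      stride=stride, padding=1, bias=False),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_channels, out_channels, kernel_size=3,
                      padding=1, bias=False),
            nn.BatchNorm2d(out_channels),
        )
        # Branch path: a 1x1 convolution raises the shortcut to the same
        # channel dimension (and spatial stride) as the main path, so the
        # two paths can be fused without a dimensionality mismatch.
        self.branch = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=1,
                      stride=stride, bias=False),
            nn.BatchNorm2d(out_channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Fuse the dimension-matched branch with the main path.
        return self.relu(self.main(x) + self.branch(x))


if __name__ == "__main__":
    block = DimensionMatchedResidualBlock(in_channels=64, out_channels=128, stride=2)
    dummy_feature_maps = torch.randn(4, 64, 56, 56)  # stand-in dermoscopic feature maps
    print(block(dummy_feature_maps).shape)  # torch.Size([4, 128, 28, 28])
```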
doi_str_mv | 10.1587/transfun.2023EAP1120 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0916-8508 |
ispartof | IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, 2024/08/01, Vol.E107.A(8), pp.1319-1327 |
issn | 0916-8508 1745-1337 |
language | eng |
recordid | cdi_proquest_journals_3104751794 |
source | J-STAGE Free |
subjects | Artificial neural networks; Big Data; Cancer; convolutional neural network; Datasets; dimension and region feature matching; Feature extraction; Feature maps; ISIC Archive; Matching; medical image classification; Medical imaging; Modules; ResNet; Skin cancer |
title | Convolutional Neural Network Based on Regional Features and Dimension Matching for Skin Cancer Classification |