Lesion-Decoupling-Based Segmentation With Large-Scale Colon and Esophageal Datasets for Early Cancer Diagnosis
Early-cancer lesions often appear flat, small, and isochromatic in medical endoscopy images, which makes them difficult to capture. By analyzing the differences between features inside and outside the lesion area, we propose a lesion-decoupling-based segmentation (LDS) network to assist early cancer diagnosis. We introduce a plug-and-play self-sampling similar feature disentangling module (FDM) to obtain accurate lesion boundaries, and a feature separation loss (FSL) to separate pathological features from normal ones. Moreover, since physicians diagnose with multimodal data, we propose a multimodal cooperative segmentation network that takes two image modalities as input: white-light images (WLIs) and narrowband images (NBIs). Our FDM and FSL perform well in both single-modal and multimodal segmentation. Extensive experiments on five backbones show that FDM and FSL can be readily applied to different backbones and significantly improve lesion segmentation accuracy, with a maximum gain in mean Intersection over Union (mIoU) of 4.58. For colonoscopy, we achieve an mIoU of up to 91.49 on our Dataset A and 84.41 on the three public datasets. For esophagoscopy, the best mIoU is 64.32 on the WLI dataset and 66.31 on the NBI dataset.
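The abstract reports segmentation accuracy as mean Intersection over Union (mIoU). For reference, here is a minimal NumPy sketch of the standard per-class mIoU computation; the function name, the two-class (background/lesion) setting, and the skipping of absent classes are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def mean_iou(pred: np.ndarray, target: np.ndarray, num_classes: int = 2) -> float:
    """Mean IoU over the classes present in either label map.

    pred and target are integer label maps of identical shape
    (assumed binary here: 0 = background, 1 = lesion).
    """
    ious = []
    for c in range(num_classes):
        pred_c = pred == c
        target_c = target == c
        union = np.logical_or(pred_c, target_c).sum()
        if union == 0:  # class absent from both maps: skip it
            continue
        inter = np.logical_and(pred_c, target_c).sum()
        ious.append(inter / union)
    return float(np.mean(ious))
```

The feature separation loss (FSL) is characterized here only at a high level (it separates pathological features from normal ones). One plausible reading is a term that pushes the average feature inside the lesion region away from the average feature of the surrounding normal tissue; the PyTorch sketch below illustrates that reading and is a hypothetical stand-in, not the authors' actual loss.

```python
import torch
import torch.nn.functional as F

def separation_loss(features: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Hypothetical separation term: penalize cosine similarity between the
    mean feature inside the lesion mask and the mean feature outside it.

    features: (B, C, H, W) feature map.
    mask: (B, 1, H, W) float lesion mask with values in {0., 1.}.
    """
    eps = 1e-6
    lesion = (features * mask).sum(dim=(2, 3)) / (mask.sum(dim=(2, 3)) + eps)
    normal = (features * (1 - mask)).sum(dim=(2, 3)) / ((1 - mask).sum(dim=(2, 3)) + eps)
    # Similarity near 1 means lesion and normal features are still entangled.
    return F.cosine_similarity(lesion, normal, dim=1).mean()
```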
Published in: | IEEE Transactions on Neural Networks and Learning Systems, 2024-08, Vol. 35 (8), p. 11142-11156 |
Main authors: | Lin, Qing; Tan, Weimin; Cai, Shilun; Yan, Bo; Li, Jichun; Zhong, Yunshi |
Format: | Article |
Language: | English |
Online access: | Order full text |
DOI: | 10.1109/TNNLS.2023.3248804 |
ISSN: | 2162-237X |
EISSN: | 2162-2388 |
PMID: | 37028330 |
Source: | IEEE Electronic Library (IEL) |
Subjects: | Algorithms; Cancer; Colon - diagnostic imaging; Colon - pathology; Colonic Neoplasms - diagnosis; Colonic Neoplasms - diagnostic imaging; Colonoscopy; Colonoscopy - methods; Databases, Factual; Dataset; Early Detection of Cancer - methods; Esophageal Neoplasms - diagnosis; Esophageal Neoplasms - diagnostic imaging; feature separation; Hospitals; Humans; Image Interpretation, Computer-Assisted - methods; Image Processing, Computer-Assisted - methods; Image segmentation; lesion-decoupling segmentation; Lesions; Medical diagnostic imaging; Medical services; multimodal; Narrow Band Imaging - methods; Neural Networks, Computer; self-sampling similar feature disentangling |