STU3Net: An Improved U‐Net With Swin Transformer Fusion for Thyroid Nodule Segmentation
Published in: International journal of imaging systems and technology, 2024-09, Vol. 34 (5)
Main authors: Deng, Xiangyu; Dang, Zhiyan; Pan, Lihao
Format: Article
Language: English
Subjects: biomedical image segmentation; Datasets; Deep learning; Diagnostic systems; Endocrine system; Image contrast; Image enhancement; Image reconstruction; Image segmentation; Machine learning; Medical imaging; Nodules; Thyroid gland; thyroid nodules; transformer; Transformers; Ultrasonic imaging; Ultrasonic testing; U‐Net
Online access: Full text
ABSTRACT
Thyroid nodules are a common endocrine system disorder for which accurate ultrasound image segmentation is important for evaluation and diagnosis, as well as a critical step in computer‐aided diagnostic systems. However, the accuracy and consistency of segmentation remain challenging due to the scattering noise, low contrast, and low resolution of ultrasound images. In this paper, we therefore propose STU3Net, a deep learning‐based computer‐aided diagnosis (CAD) method for the automatic segmentation of thyroid nodules. The method employs a modified Swin Transformer combined with a CNN encoder, which extracts the morphological features and edge details of thyroid nodules in ultrasound images. When decoding the features for image reconstruction, we introduce a modified three‐layer U‐Net with cross‐layer connectivity to further enhance image restoration. This cross‐layer connectivity creates skip connections between different layers, merging the detailed information of the shallow layers with the information of the deeper layers and thereby improving the network's ability to capture and represent image features. Through comparison experiments with current mainstream deep learning methods on the TN3K and BUSI datasets, we validate the superior thyroid nodule segmentation performance of STU3Net. The experimental results show that STU3Net outperforms most mainstream models on the TN3K dataset, with Dice and IoU reaching 0.8368 and 0.7416, respectively, significantly better than the other methods. The method demonstrates excellent performance on these datasets and provides radiologists with an effective auxiliary tool for accurately detecting thyroid nodules in ultrasound images.
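The record does not include the article's code. As a rough illustration of the cross‐layer skip connections the abstract describes, here is a minimal PyTorch sketch of a three‐stage decoder that merges shallow detail with deep features. All module names, channel sizes, and structural choices below are hypothetical assumptions for illustration, not the authors' STU3Net implementation.

```python
# Illustrative sketch only -- not the authors' code. A minimal PyTorch
# decoder with U-Net-style skip connections, assuming encoder features
# f1 (shallow, high resolution) through f3 (deep, low resolution).
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    """Two 3x3 convolutions with batch norm and ReLU."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        )
    def forward(self, x):
        return self.block(x)

class SkipDecoder(nn.Module):
    """Three-stage decoder: upsample the deepest feature map and, at each
    stage, concatenate the matching encoder feature (skip connection),
    merging shallow detail with deeper information."""
    def __init__(self, chs=(256, 128, 64), n_classes=1):
        super().__init__()
        self.up = nn.ModuleList(
            nn.ConvTranspose2d(chs[i], chs[i + 1], 2, stride=2)
            for i in range(len(chs) - 1))
        self.dec = nn.ModuleList(
            ConvBlock(chs[i + 1] * 2, chs[i + 1])
            for i in range(len(chs) - 1))
        self.head = nn.Conv2d(chs[-1], n_classes, 1)  # per-pixel logits

    def forward(self, feats):
        # feats: [f1, f2, f3] from shallow to deep, channels (64, 128, 256)
        x = feats[-1]
        for up, dec, skip in zip(self.up, self.dec, reversed(feats[:-1])):
            x = up(x)                        # upsample deep features
            x = torch.cat([x, skip], dim=1)  # cross-layer skip connection
            x = dec(x)                       # fuse shallow + deep info
        return self.head(x)
```

Under these assumptions, feeding encoder features of shapes (1, 64, 128, 128), (1, 128, 64, 64), and (1, 256, 32, 32) yields a (1, 1, 128, 128) logit map at the resolution of the shallowest feature.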
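The Dice and IoU values quoted in the abstract (0.8368 and 0.7416 on TN3K) are overlap metrics for binary masks; the sketch below uses their standard definitions, since the paper's own evaluation code is not part of this record.

```python
# Standard Dice coefficient and IoU for binary segmentation masks
# (common definitions, not taken from the paper's code).
import numpy as np

def dice_and_iou(pred, target, eps=1e-7):
    """pred, target: arrays of the same shape; nonzero = nodule pixel."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    dice = 2.0 * inter / (pred.sum() + target.sum() + eps)
    iou = inter / (union + eps)
    return dice, iou
```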
DOI: 10.1002/ima.23160
ISSN: 0899-9457
EISSN: 1098-1098
Publisher: John Wiley & Sons, Inc., Hoboken, USA
Source: Access via Wiley Online Library