Deep learning-based apical lesion segmentation from panoramic radiographs

Bibliographic details
Published in: Imaging science in dentistry, 2022-12, Vol. 52 (4), p. 351-357
Authors: Song, Il-Seok; Shin, Hak-Kyun; Kang, Ju-Hee; Kim, Jo-Eun; Huh, Kyung-Hoe; Yi, Won-Jin; Lee, Sam-Sun; Heo, Min-Suk
Publisher: Korean Academy of Oral and Maxillofacial Radiology
Format: Article
Language: English
ISSN: 2233-7822
eISSN: 2233-7830
DOI: 10.5624/isd.20220078
PMID: 36605863
Subjects: Original
Source: KoreaMed Synapse; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; PubMed Central Open Access; KoreaMed Open Access; PubMed Central
Online access: Full text

Abstract

Convolutional neural networks (CNNs) have rapidly emerged as one of the most promising artificial intelligence methods in medical and dental research. CNNs can provide an effective diagnostic methodology allowing for the detection of early-stage diseases. Therefore, this study aimed to evaluate the performance of a deep CNN algorithm for apical lesion segmentation from panoramic radiographs. A total of 1,000 panoramic images showing apical lesions were separated into training (n=800, 80%), validation (n=100, 10%), and test (n=100, 10%) datasets. The performance of identifying apical lesions was evaluated by calculating the precision, recall, and F1-score. In the test group of 180 apical lesions, 147 lesions were segmented from panoramic radiographs at an intersection over union (IoU) threshold of 0.3. The F1-scores at IoU thresholds of 0.3, 0.4, and 0.5 were 0.828, 0.815, and 0.742, respectively. This study showed the potential utility of a deep learning-guided approach for the segmentation of apical lesions. The deep CNN algorithm using U-Net demonstrated considerably high performance in detecting apical lesions.
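
For context on the evaluation scheme described in the abstract, lesion-level scoring typically matches each predicted lesion to a ground-truth lesion and counts it as a true positive when their intersection over union reaches the chosen threshold; precision, recall, and F1-score then follow from the true-positive, false-positive, and false-negative counts. The snippet below is a minimal illustrative sketch, not the authors' code: the function names (mask_iou, lesion_detection_scores) and the greedy one-to-one matching strategy are assumptions made for illustration.

```python
import numpy as np

def mask_iou(pred: np.ndarray, gt: np.ndarray) -> float:
    """Intersection over union of two binary lesion masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    union = np.logical_or(pred, gt).sum()
    if union == 0:
        return 0.0
    return float(np.logical_and(pred, gt).sum() / union)

def lesion_detection_scores(pred_masks, gt_masks, iou_threshold=0.3):
    """Greedy one-to-one matching of predicted to ground-truth lesions.

    A prediction counts as a true positive if its best still-unmatched
    ground-truth lesion reaches the IoU threshold (an assumption; the
    paper does not specify its matching rule).
    """
    matched_gt = set()
    tp = 0
    for pred in pred_masks:
        best_iou, best_idx = 0.0, None
        for i, gt in enumerate(gt_masks):
            if i in matched_gt:
                continue
            iou = mask_iou(pred, gt)
            if iou > best_iou:
                best_iou, best_idx = iou, i
        if best_idx is not None and best_iou >= iou_threshold:
            matched_gt.add(best_idx)
            tp += 1
    fp = len(pred_masks) - tp
    fn = len(gt_masks) - tp
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# Toy example: one ground-truth lesion and one perfectly matching prediction.
gt = [np.pad(np.ones((4, 4), dtype=np.uint8), 2)]
pred = [np.pad(np.ones((4, 4), dtype=np.uint8), 2)]
print(lesion_detection_scores(pred, gt, iou_threshold=0.3))  # (1.0, 1.0, 1.0)
```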