Clinical Validation of a Deep-Learning Segmentation Software in Head and Neck: An Early Analysis in a Developing Radiation Oncology Center
Published in: International journal of environmental research and public health, 2022-07, Vol.19 (15), p.9057
Main authors: D’Aviero, Andrea; Re, Alessia; Catucci, Francesco; Piccari, Danila; Votta, Claudio; Piro, Domenico; Piras, Antonio; Di Dio, Carmela; Iezzi, Martina; Preziosi, Francesco; Menna, Sebastiano; Quaranta, Flaviovincenzo; Boschetti, Althea; Marras, Marco; Miccichè, Francesco; Gallus, Roberto; Indovina, Luca; Bussu, Francesco; Valentini, Vincenzo; Cusumano, Davide; Mattiucci, Gian Carlo
Format: Article
Language: English
Online access: Full text
container_issue | 15 |
container_start_page | 9057 |
container_title | International journal of environmental research and public health |
container_volume | 19 |
creator | D’Aviero, Andrea; Re, Alessia; Catucci, Francesco; Piccari, Danila; Votta, Claudio; Piro, Domenico; Piras, Antonio; Di Dio, Carmela; Iezzi, Martina; Preziosi, Francesco; Menna, Sebastiano; Quaranta, Flaviovincenzo; Boschetti, Althea; Marras, Marco; Miccichè, Francesco; Gallus, Roberto; Indovina, Luca; Bussu, Francesco; Valentini, Vincenzo; Cusumano, Davide; Mattiucci, Gian Carlo |
description | Background: Delineation of organs at risk (OARs) is a crucial step of the radiotherapy (RT) treatment planning workflow. Time consumption and inter-observer variability are the main issues in manual OAR delineation, particularly in the head and neck (H&N) district. Deep-learning-based auto-segmentation is a promising strategy to improve OAR contouring in radiotherapy departments. A comparison of deep-learning-generated auto-contours (AC) with manual contours (MC) was performed by three expert radiation oncologists from a single center. Methods: Planning computed tomography (CT) scans of patients undergoing RT treatments for H&N cancers were considered. CT scans were processed with Limbus Contour, a commercial deep-learning-based auto-segmentation software, to generate the AC. The H&N protocol was used, with the structure set consisting of bilateral brachial plexus, brain, brainstem, bilateral cochlea, pharyngeal constrictors, eye globes, bilateral lens, mandible, optic chiasm, bilateral optic nerves, oral cavity, bilateral parotids, spinal cord, bilateral submandibular glands, lips and thyroid. Manual revision of OARs was performed according to international consensus guidelines. The AC and MC were compared using the Dice similarity coefficient (DSC) and 95% Hausdorff distance transform (DT). Results: A total of 274 contours obtained by processing CT scans were included in the analysis. The highest values of DSC were obtained for the brain (DSC 1.00), the left and right eye globes, and the mandible (DSC 0.98). The structures with greater MC editing were the optic chiasm, optic nerves and cochleae. Conclusions: In this preliminary analysis, deep-learning auto-segmentation seems to provide acceptable H&N OAR delineations. For less accurate organs, AC could be considered a starting point for review and manual adjustment. Our results suggest that AC could become a useful time-saving tool to optimize workload and resources in RT departments. |
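For readers unfamiliar with the two comparison metrics named in the abstract, the sketch below shows one common way to compute a Dice similarity coefficient and a 95th-percentile symmetric surface distance between two binary masks. It is only an illustration under assumed conditions (NumPy/SciPy available, 3D boolean masks already resampled to the same grid and voxel spacing); it is not the evaluation code used in the study, the function names `dice`, `surface` and `hd95` are hypothetical, and the authors' exact definition of the "95% Hausdorff distance transform" may differ in detail.

```python
import numpy as np
from scipy import ndimage


def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice similarity coefficient between two boolean masks of equal shape."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return float(2.0 * np.logical_and(a, b).sum() / denom) if denom else 1.0


def surface(mask: np.ndarray) -> np.ndarray:
    """Boundary voxels of a mask (voxels removed by a one-voxel erosion)."""
    mask = mask.astype(bool)
    return mask & ~ndimage.binary_erosion(mask)


def hd95(a: np.ndarray, b: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> float:
    """95th-percentile symmetric surface distance (in mm) between two non-empty masks."""
    sa, sb = surface(a), surface(b)
    # Distance from every voxel to the nearest boundary voxel of the *other* structure,
    # evaluated only on this structure's own boundary; `sampling` applies the voxel size.
    d_ab = ndimage.distance_transform_edt(~sb, sampling=spacing)[sa]
    d_ba = ndimage.distance_transform_edt(~sa, sampling=spacing)[sb]
    return float(np.percentile(np.hstack((d_ab, d_ba)), 95))


# Hypothetical usage on two already-aligned 3D masks (e.g. AC vs. MC for one OAR):
# score = dice(ac_mask, mc_mask)
# distance_mm = hd95(ac_mask, mc_mask, spacing=(3.0, 1.0, 1.0))
```

With masks compared in this way, a DSC close to 1.0 (as reported here for the brain, eye globes and mandible) indicates near-perfect overlap, while larger 95% surface distances flag structures such as the optic chiasm, optic nerves and cochleae that required more manual editing.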
doi_str_mv | 10.3390/ijerph19159057 |
format | Article |
publisher | MDPI AG, Basel |
pubdate | 2022-07-25 |
pmid | 35897425 |
pmcid | PMC9329735 |
rights | 2022 by the authors; open access under the Creative Commons Attribution (CC BY) license |
fulltext | fulltext |
identifier | ISSN: 1660-4601 |
ispartof | International journal of environmental research and public health, 2022-07, Vol.19 (15), p.9057 |
issn | 1660-4601; 1661-7827 |
language | eng |
recordid | cdi_pubmedcentral_primary_oai_pubmedcentral_nih_gov_9329735 |
source | MDPI - Multidisciplinary Digital Publishing Institute; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; PubMed Central; Free Full-Text Journals in Chemistry; PubMed Central Open Access |
subjects | Brain stem; Cancer therapies; Cochlea; Computed tomography; Constrictors; Contouring; Contours; Deep learning; Delineation; Exports; Eye; Eye lens; Head & neck cancer; Learning; Metric space; Nerves; Oncology; Optic chiasm; Oral cavity; Radiation; Radiation therapy; Segmentation; Software; Thyroid; Tumors; Work stations; Workflow |
title | Clinical Validation of a Deep-Learning Segmentation Software in Head and Neck: An Early Analysis in a Developing Radiation Oncology Center |