Evaluation of a deformable image registration quality assurance tool for head and neck cancer patients

Introduction A challenge in implementing deformable image registration (DIR) in radiation therapy planning is effectively communicating registration accuracy to the radiation oncologist. This study aimed to evaluate the MIM® quality assurance (QA) tool for rating DIR accuracy.

Detailed Description

Bibliographic Details
Published in: Journal of medical radiation sciences 2020-12, Vol.67 (4), p.284-293
Authors: Mee, Molly, Stewart, Kate, Lathouras, Marika, Truong, Helen, Hargrave, Catriona
Format: Article
Language: eng
Subjects:
Online access: Full text
container_end_page 293
container_issue 4
container_start_page 284
container_title Journal of medical radiation sciences
container_volume 67
creator Mee, Molly
Stewart, Kate
Lathouras, Marika
Truong, Helen
Hargrave, Catriona
description Introduction A challenge in implementing deformable image registration (DIR) in radiation therapy planning is effectively communicating registration accuracy to the radiation oncologist. This study aimed to evaluate the MIM® quality assurance (QA) tool for rating DIR accuracy.
Methods Retrospective DIR was performed on CT images for 35 head and neck cancer patients. The QA tool was used to rate DIR accuracy as good, fair or bad. Thirty registered patient images were assessed independently by three RTs, and a further five patients were assessed by five RTs. Ratings were evaluated by comparison of Hausdorff Distance (HD), Mean Distance to Agreement (MDA), Dice Similarity Coefficients (DSC) and Jacobian determinants for parotid and mandible subregions on the two CTs post‐DIR. Inter‐operator reliability was assessed using Krippendorff's alpha coefficient (KALPA). Rating time and volume measures for each rating were also calculated.
Results Quantitative metrics calculated for most anatomical subregions reflected the expected trend by registration accuracy, with good ratings obtaining the most ideal values on average (HD = 7.50 ± 3.18, MDA = 0.64 ± 0.47, DSC = 0.90 ± 0.07, Jacobian = 0.95 ± 0.06). The highest inter‐operator reliability was observed for good ratings and within the parotids (KALPA 0.66–0.93), whilst ratings varied the most in regions of dental artefact. Overall, the average rating time was 33 minutes, and the least commonly applied rating by volume was fair.
Conclusion Results from the qualitative and quantitative data, operator rating differences and rating times suggest highlighting only bad regions of DIR accuracy, and implementing clinical guidelines and RT training for consistent and efficient use of the QA tool.
As deformable image registration (DIR) is becoming increasingly used in clinical practice, this study aimed to evaluate a new QA tool to rate the accuracy of DIR of head and neck cancer patient planning and diagnostic imaging. Regions on deformed images were qualitatively evaluated using good, fair and bad rating levels, which were 1) compared to quantitative metrics recommended in the AAPM TG-132 report and 2) compared between operators using Krippendorff's alpha reliability test. Results suggest utilising qualitative assessments for only the bad rating level, as well as developing clinical guidelines and training to support the clinical implementation of the QA tool.
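The quantitative metrics named in the abstract (HD, MDA, DSC and the Jacobian determinant) are standard registration-QA quantities. A minimal NumPy sketch of each is shown below for binary masks, contour point sets and a 2-D displacement field; this is purely illustrative (the function names and input shapes are our own assumptions, not the MIM® or AAPM reference implementation):

```python
import numpy as np

def dice(a, b):
    """Dice Similarity Coefficient between two binary masks (1 = perfect overlap)."""
    a, b = np.asarray(a, dtype=bool), np.asarray(b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(pts_a, pts_b):
    """Symmetric Hausdorff Distance between two (N, d) point sets:
    the largest distance from any point of one set to the other set."""
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def mean_distance_to_agreement(pts_a, pts_b):
    """Mean Distance to Agreement: nearest-neighbour distances averaged
    over both directions between the two contours."""
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=-1)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

def jacobian_determinant(disp):
    """Jacobian determinant of the 2-D deformation x + u(x); values near 1
    indicate little local volume change. disp has shape (H, W, 2)."""
    d0_0, d0_1 = np.gradient(disp[..., 0])  # du0/dx0, du0/dx1
    d1_0, d1_1 = np.gradient(disp[..., 1])  # du1/dx0, du1/dx1
    return (1 + d0_0) * (1 + d1_1) - d0_1 * d1_0
```

For instance, an identity deformation (zero displacement) yields a Jacobian determinant of exactly 1 everywhere, matching the study's interpretation of values near 1 as anatomically plausible.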
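Krippendorff's alpha for the nominal good/fair/bad ratings can be computed from a coincidence matrix over all rater pairs within each rated unit. The sketch below is a minimal stdlib implementation under an assumed input shape (one list of labels per unit), not the software the authors used:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal ratings.
    units: list of units; each unit is the list of category labels the
    raters assigned to it (raters who skipped a unit are simply absent)."""
    o = Counter()                       # coincidence counts o_ck
    for unit in units:
        m = len(unit)
        if m < 2:
            continue                    # a unit rated once carries no pairable information
        for c, k in permutations(unit, 2):
            o[(c, k)] += 1.0 / (m - 1)
    n_c = Counter()                     # marginal totals per category
    for (c, _), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())
    observed = sum(w for (c, k), w in o.items() if c != k)
    expected = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
    return 1.0 - observed / expected if expected else 1.0
```

Perfect agreement across raters gives alpha = 1.0, while any disagreement pulls it below 1, which is how ranges such as the study's 0.66–0.93 for the parotids arise.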
doi_str_mv 10.1002/jmrs.428
format Article
publisher United States: John Wiley &amp; Sons, Inc
pmid 33615738
fulltext fulltext
identifier ISSN: 2051-3895
ispartof Journal of medical radiation sciences, 2020-12, Vol.67 (4), p.284-293
issn 2051-3895
2051-3909
language eng
recordid cdi_pubmedcentral_primary_oai_pubmedcentral_nih_gov_7754017
source MEDLINE; Wiley Online Library Open Access; DOAJ Directory of Open Access Journals; Wiley Online Library Journals Frontfile Complete; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; PubMed Central
subjects Accuracy
deformable image registration
Female
Head & neck cancer
Head and Neck Neoplasms - diagnostic imaging
Head and Neck Neoplasms - radiotherapy
Humans
Image Processing, Computer-Assisted - methods
Image Processing, Computer-Assisted - standards
Male
Medical imaging
Original
Patients
quality assurance
Quality Assurance, Health Care - standards
Quality control
Radiation therapy
Radiotherapy Planning, Computer-Assisted - methods
Radiotherapy Planning, Computer-Assisted - standards
Ratings & rankings
Registration
Retrospective Studies
Tomography, X-Ray Computed - standards
treatment planning
Trends
title Evaluation of a deformable image registration quality assurance tool for head and neck cancer patients
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-09T23%3A54%3A51IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_pubme&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Evaluation%20of%20a%20deformable%20image%20registration%20quality%20assurance%20tool%20for%20head%20and%20neck%20cancer%20patients&rft.jtitle=Journal%20of%20medical%20radiation%20sciences&rft.au=Mee,%20Molly&rft.date=2020-12&rft.volume=67&rft.issue=4&rft.spage=284&rft.epage=293&rft.pages=284-293&rft.issn=2051-3895&rft.eissn=2051-3909&rft_id=info:doi/10.1002/jmrs.428&rft_dat=%3Cproquest_pubme%3E2492281441%3C/proquest_pubme%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2467781370&rft_id=info:pmid/33615738&rfr_iscdi=true