Improving Surgical Situational Awareness with Signed Distance Field: A Pilot Study in Virtual Reality

Bibliographic Details
Published in: arXiv.org 2024-12
Main authors: Ishida, Hisashi, Barragan, Juan Antonio, Munawar, Adnan, Li, Zhaoshuo, Ding, Andy, Kazanzides, Peter, Trakimas, Danielle, Creighton, Francis X, Taylor, Russell H
Format: Article
Language: eng
Online access: Full text
container_title arXiv.org
creator Ishida, Hisashi
Barragan, Juan Antonio
Munawar, Adnan
Li, Zhaoshuo
Ding, Andy
Kazanzides, Peter
Trakimas, Danielle
Creighton, Francis X
Taylor, Russell H
description The introduction of image-guided surgical navigation (IGSN) has greatly benefited technically demanding surgical procedures by providing real-time support and guidance to the surgeon during surgery. To develop effective IGSN, a careful selection of the surgical information and the medium to present this information to the surgeon is needed. However, this is not a trivial task due to the broad array of available options. To address this problem, we have developed an open-source library that facilitates the development of multimodal navigation systems in a wide range of surgical procedures relying on medical imaging data. To provide guidance, our system calculates the minimum distance between the surgical instrument and the anatomy and then presents this information to the user through different mechanisms. The real-time performance of our approach is achieved by calculating Signed Distance Fields at initialization from segmented anatomical volumes. Using this framework, we developed a multimodal surgical navigation system to help surgeons navigate anatomical variability in a skull base surgery simulation environment. Three different feedback modalities were explored: visual, auditory, and haptic. To evaluate the proposed system, a pilot user study was conducted in which four clinicians performed mastoidectomy procedures with and without guidance. Each condition was assessed using objective performance and subjective workload metrics. This pilot user study showed improvements in procedural safety without additional time or workload. These results demonstrate our pipeline's successful use case in the context of mastoidectomy.
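The description explains the guidance mechanism: a Signed Distance Field (SDF) is precomputed once at initialization from the segmented anatomical volume, and the minimum instrument-to-anatomy distance is then looked up in real time to drive the feedback. The sketch below is a minimal Python/SciPy illustration of that idea; it is not the authors' open-source library, and the function names, voxel spacing, tip position, and 2 mm warning threshold are assumptions for illustration only.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt, map_coordinates


def compute_sdf(segmentation, voxel_spacing=(1.0, 1.0, 1.0)):
    """Signed distance (mm) to the anatomy: positive outside, negative inside.
    Coarse voxel-level approximation; computed once at initialization."""
    seg = segmentation.astype(bool)
    outside = distance_transform_edt(~seg, sampling=voxel_spacing)  # distance from background voxels to the anatomy
    inside = distance_transform_edt(seg, sampling=voxel_spacing)    # depth of voxels inside the anatomy
    return outside - inside


def query_distance(sdf, tip_voxel_coords):
    """Trilinear interpolation of the SDF at a fractional voxel coordinate (z, y, x)."""
    coords = np.asarray(tip_voxel_coords, dtype=float).reshape(3, 1)
    return float(map_coordinates(sdf, coords, order=1)[0])


if __name__ == "__main__":
    # Placeholder "critical anatomy": a 20-voxel cube in a 64^3 volume, 0.5 mm isotropic voxels.
    seg = np.zeros((64, 64, 64), dtype=bool)
    seg[20:40, 20:40, 20:40] = True
    sdf = compute_sdf(seg, voxel_spacing=(0.5, 0.5, 0.5))  # one-time precomputation

    tip = (17.0, 30.0, 30.0)      # hypothetical drill-tip position in voxel coordinates
    d = query_distance(sdf, tip)  # real-time lookup, no per-frame mesh distance computation
    if d < 2.0:                   # assumed 2 mm safety margin
        print(f"Proximity warning: {d:.2f} mm from critical anatomy")
    else:
        print(f"Safe: {d:.2f} mm from critical anatomy")
```

Precomputing the SDF trades memory for constant-time distance queries during drilling, which is what makes the real-time visual, auditory, and haptic feedback described in the abstract feasible.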
doi_str_mv 10.48550/arxiv.2303.01733
format Article
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2024-12
issn 2331-8422
language eng
recordid cdi_arxiv_primary_2303_01733
source arXiv.org; Free E-Journals
subjects Computer Science - Human-Computer Interaction
Computer Science - Robotics
Feedback
Mathematical analysis
Medical imaging
Navigation systems
Real time
Situational awareness
Surgeons
Surgery
Surgical instruments
Virtual reality
Workload
Workloads
title Improving Surgical Situational Awareness with Signed Distance Field: A Pilot Study in Virtual Reality
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-09T18%3A05%3A45IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_arxiv&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Improving%20Surgical%20Situational%20Awareness%20with%20Signed%20Distance%20Field:%20A%20Pilot%20Study%20in%20Virtual%20Reality&rft.jtitle=arXiv.org&rft.au=Ishida,%20Hisashi&rft.date=2024-12-13&rft.eissn=2331-8422&rft_id=info:doi/10.48550/arxiv.2303.01733&rft_dat=%3Cproquest_arxiv%3E2783520583%3C/proquest_arxiv%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2783520583&rft_id=info:pmid/&rfr_iscdi=true