Navigate biopsy with ultrasound under augmented reality device: Towards higher system performance


Detailed description

Saved in:
Bibliographic details
Published in: Computers in biology and medicine, 2024-05, Vol.174, p.108453, Article 108453
Main authors: Li, Haowei; Yan, Wenqing; Zhao, Jiasheng; Ji, Yuqi; Qian, Long; Ding, Hui; Zhao, Zhe; Wang, Guangzhi
Format: Article
Language: English
Subjects:
Online access: Full text
Description: Biopsies play a crucial role in determining the classification and staging of tumors. Ultrasound is frequently used in this procedure to provide real-time anatomical information. Using augmented reality (AR), surgeons can visualize ultrasound data and spatial navigation information seamlessly integrated with real tissues, facilitating faster and more precise biopsy operations. We have developed an AR biopsy navigation system characterized by low display latency and high accuracy. Ultrasound data is first read by an image capture card and streamed to Unity via network communication. In Unity, navigation information is rendered and transmitted to the HoloLens 2 device using holographic remoting. Concurrently, a retro-reflective tool tracking method is implemented on the HoloLens 2, enabling simultaneous tracking of the ultrasound probe and biopsy needle. Distinct navigation information is provided during in-plane and out-of-plane puncture. To evaluate the effectiveness of the system, we conducted a study involving ten participants, assessing puncture accuracy and biopsy time in comparison with traditional methods. Ultrasound images were streamed from the ultrasound device to the AR headset with a latency of 122.49±11.61 ms, of which only 16.22±11.25 ms elapsed after data acquisition from the image capture card. Navigation accuracy reached 1.23±0.68 mm in the image plane and 0.95±0.70 mm outside the image plane, within a depth range of 200 mm. Notably, the system achieved success rates of 98% and 95% in out-of-plane and in-plane biopsies, respectively, among ten participants with little ultrasound experience. In summary, this paper introduces an AR-based ultrasound biopsy navigation system characterized by high navigation accuracy and minimal latency. The system provides distinct visualization content during in-plane and out-of-plane operations according to their different characteristics. The use-case study in this paper showed that the system can help young surgeons perform biopsies faster and more accurately.

Highlights:
•Infrared tools are tracked on-device while rendering remotely, for accuracy and system performance.
•Multiple infrared tools are tracked simultaneously, even when partially occluded.
•Distinct visualization methods for in-plane and out-of-plane biopsy.
•Use-case studies adopting different biopsy methods.
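The abstract reports its latency results as mean ± standard deviation over timestamped frames (e.g. 122.49±11.61 ms end-to-end). As a minimal sketch of how such a figure can be computed from paired capture/display timestamps — all function names and the sample timestamps below are hypothetical, not taken from the paper:

```python
import statistics

def latency_stats(capture_ts, display_ts):
    """Per-frame latency between capture and display timestamps (seconds in,
    milliseconds out). Returns (mean, sample stdev), the mean ± sd form used
    for the latency figures reported in the abstract."""
    latencies_ms = [(d - c) * 1000.0 for c, d in zip(capture_ts, display_ts)]
    return statistics.mean(latencies_ms), statistics.stdev(latencies_ms)

# Hypothetical data: frames captured at 30 Hz, each displayed ~120 ms
# later with some jitter.
capture = [i / 30.0 for i in range(5)]
delays = [0.120, 0.115, 0.130, 0.118, 0.125]
display = [t + dt for t, dt in zip(capture, delays)]

mean_ms, sd_ms = latency_stats(capture, display)
print(f"{mean_ms:.2f} ± {sd_ms:.2f} ms")  # → 121.60 ± 5.94 ms
```

Subtracting the capture-card acquisition timestamp instead of the ultrasound-device timestamp would isolate the post-acquisition portion of the pipeline, analogous to the 16.22±11.25 ms figure.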
DOI: 10.1016/j.compbiomed.2024.108453
ISSN: 0010-4825
EISSN: 1879-0534
Source: MEDLINE; Access via ScienceDirect (Elsevier)
Subjects:
Accuracy
Augmented Reality
Biopsy
Data acquisition
Humans
Image acquisition
Image-Guided Biopsy - instrumentation
Image-Guided Biopsy - methods
Latency
Locatable ultrasound
Navigation behavior
Navigation systems
Spatial data
Surgeons
Surgical navigation
Tracking
Tumor biopsy
Ultrasonic imaging
Ultrasonic testing
Ultrasonography - instrumentation
Ultrasonography - methods
Ultrasound
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-16T08%3A45%3A21IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Navigate%20biopsy%20with%20ultrasound%20under%20augmented%20reality%20device:%20Towards%20higher%20system%20performance&rft.jtitle=Computers%20in%20biology%20and%20medicine&rft.au=Li,%20Haowei&rft.date=2024-05&rft.volume=174&rft.spage=108453&rft.epage=108453&rft.pages=108453-108453&rft.artnum=108453&rft.issn=0010-4825&rft.eissn=1879-0534&rft_id=info:doi/10.1016/j.compbiomed.2024.108453&rft_dat=%3Cproquest_cross%3E3046570764%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3046570764&rft_id=info:pmid/38636327&rft_els_id=S0010482524005377&rfr_iscdi=true