Augmented reality visualisation for orthopaedic surgical guidance with pre- and intra-operative multimodal image data fusion

Augmented reality (AR) has proven to be a useful, exciting technology in several areas of healthcare. AR may especially enhance the operator's experience in minimally invasive surgical applications by providing more intuitive and naturally immersive visualisation in those procedures which heavily rely on three-dimensional (3D) imaging data.
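The abstract of this record reports registration accuracy as per-axis root mean square errors (RMSE) between fiducial marker locations in the real and virtual spaces. As a rough sketch of that metric only — the function name and the marker coordinates below are illustrative, not taken from the paper:

```python
import math

def per_axis_rmse(real, virtual):
    """Root mean square error along each axis (x, y, z) over paired markers.

    `real` and `virtual` are equal-length lists of (x, y, z) tuples giving
    corresponding fiducial marker positions in the two spaces.
    """
    n = len(real)
    return tuple(
        math.sqrt(sum((r[a] - v[a]) ** 2 for r, v in zip(real, virtual)) / n)
        for a in range(3)
    )

# Three fabricated marker pairs (units: mm), each offset by (1, 2, 3)
real_markers = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
virtual_markers = [(1.0, 2.0, 3.0), (11.0, 2.0, 3.0), (1.0, 12.0, 3.0)]
print(per_axis_rmse(real_markers, virtual_markers))  # (1.0, 2.0, 3.0)
```

With a constant offset between the two point sets, the per-axis RMSE simply equals that offset, which makes the fabricated example easy to check by hand.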

Bibliographic Details
Published in: Healthcare technology letters 2018-10, Vol.5 (5), p.189-193
Main authors: El-Hariri, Houssam, Pandey, Prashant, Hodgson, Antony J, Garbi, Rafeef
Format: Article
Language: English
Online access: Full text
description Augmented reality (AR) has proven to be a useful, exciting technology in several areas of healthcare. AR may especially enhance the operator's experience in minimally invasive surgical applications by providing more intuitive and naturally immersive visualisation in those procedures which heavily rely on three-dimensional (3D) imaging data. Benefits include improved operator ergonomics, reduced fatigue, and simplified hand–eye coordination. Head-mounted AR displays may hold great potential for enhancing surgical navigation given their compactness and intuitiveness of use. In this work, the authors propose a method that can intra-operatively locate bone structures using tracked ultrasound (US), registers to the corresponding pre-operative computed tomography (CT) data and generates 3D AR visualisation of the operated surgical scene through a head-mounted display. The proposed method deploys optically-tracked US, bone surface segmentation from the US and CT image volumes, and multimodal volume registration to align pre-operative to the corresponding intra-operative data. The enhanced surgical scene is then visualised in an AR framework using a HoloLens. They demonstrate the method's utility using a foam pelvis phantom and quantitatively assess accuracy by comparing the locations of fiducial markers in the real and virtual spaces, yielding root mean square errors of 3.22, 22.46, and 28.30 mm in the x, y, and z directions, respectively.
doi_str_mv 10.1049/htl.2018.5061
identifier ISSN: 2053-3713
source Wiley Journals; DOAJ Directory of Open Access Journals; EZB-FREE-00999 freely available EZB journals; Wiley Online Library (Open Access Collection); PubMed Central
subjects 3D AR visualisation
Augmented reality
augmented reality visualisation
biomedical ultrasonics
bone
bone structures
bone surface segmentation
Calibration
computerised tomography
corresponding intra-operative data
CT image volumes
data visualisation
enhanced surgical scene
fiducial marker locations
foam pelvis phantom
head-mounted AR displays
healthcare
helmet mounted displays
HoloLens
image fusion
image registration
image segmentation
intraoperative multimodal image data fusion
intuitive visualisation
Localization
medical image processing
minimally invasive surgical applications
multimodal volume registration
naturally immersive visualisation
operated surgical scene
operator ergonomics
optically-tracked US
orthopaedic surgical guidance
orthopaedics
Orthopedics
phantoms
preoperative computed tomography data
reduced fatigue
Registration
root mean square errors
Sensors
simplified hand-eye coordination
Special Issue: Papers from the 12th Workshop on Augmented Environments for Computer-Assisted Interventions
Surgeons
Surgery
surgical navigation
three-dimensional imaging data
tracked ultrasound
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-27T05%3A39%3A17IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Augmented%20reality%20visualisation%20for%20orthopaedic%20surgical%20guidance%20with%20pre-%20and%20intra-operative%20multimodal%20image%20data%20fusion&rft.jtitle=Healthcare%20technology%20letters&rft.au=El-Hariri,%20Houssam&rft.date=2018-10&rft.volume=5&rft.issue=5&rft.spage=189&rft.epage=193&rft.pages=189-193&rft.issn=2053-3713&rft.eissn=2053-3713&rft_id=info:doi/10.1049/htl.2018.5061&rft_dat=%3Cproquest_cross%3E3090589801%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3090589801&rft_id=info:pmid/&rft_doaj_id=oai_doaj_org_article_cefbcd32a28a4b8bb6762dd737906625&rfr_iscdi=true