Augmented reality visualisation for orthopaedic surgical guidance with pre- and intra-operative multimodal image data fusion

Bibliographic details
Published in: Healthcare Technology Letters, 2018-10, Vol. 5 (5), pp. 189-193
Authors: El-Hariri, Houssam; Pandey, Prashant; Hodgson, Antony J; Garbi, Rafeef
Format: Article
Language: English
Description
Abstract: Augmented reality (AR) has proven to be a useful, exciting technology in several areas of healthcare. AR may especially enhance the operator's experience in minimally invasive surgical applications by providing more intuitive and naturally immersive visualisation in procedures that rely heavily on three-dimensional (3D) imaging data. Benefits include improved operator ergonomics, reduced fatigue, and simplified hand–eye coordination. Head-mounted AR displays may hold great potential for enhancing surgical navigation given their compactness and intuitiveness of use. In this work, the authors propose a method that intra-operatively locates bone structures using tracked ultrasound (US), registers them to the corresponding pre-operative computed tomography (CT) data, and generates a 3D AR visualisation of the surgical scene through a head-mounted display. The proposed method deploys optically tracked US, bone surface segmentation from the US and CT image volumes, and multimodal volume registration to align the pre-operative data to the corresponding intra-operative data. The enhanced surgical scene is then visualised in an AR framework using a HoloLens. The authors demonstrate the method's utility using a foam pelvis phantom and quantitatively assess accuracy by comparing the locations of fiducial markers in the real and virtual spaces, yielding root mean square errors of 3.22, 22.46, and 28.30 mm in the x, y, and z directions, respectively.
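
The abstract does not include the authors' implementation details, so the following is only a minimal sketch in Python (NumPy) of two generic steps the summary describes: a rigid least-squares (Kabsch/SVD) alignment of paired 3D points, standing in for the multimodal CT-to-US registration, and the per-axis root-mean-square error used to compare fiducial marker locations in the real and virtual spaces. The function names and all coordinate values below are hypothetical and are not taken from the paper.

    import numpy as np

    def rigid_align(source_pts, target_pts):
        # Least-squares rigid transform (Kabsch/SVD) mapping source -> target,
        # given paired (N, 3) arrays of corresponding 3D points in mm.
        src = np.asarray(source_pts, dtype=float)
        tgt = np.asarray(target_pts, dtype=float)
        src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)
        H = (src - src_c).T @ (tgt - tgt_c)      # 3x3 cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = tgt_c - R @ src_c
        return R, t                              # point p maps to R @ p + t

    def per_axis_rmse(real_pts, virtual_pts):
        # Root-mean-square error along x, y and z between paired fiducials.
        diff = np.asarray(real_pts, dtype=float) - np.asarray(virtual_pts, dtype=float)
        return np.sqrt((diff ** 2).mean(axis=0))

    # Hypothetical fiducial coordinates (mm): CT space vs. tracked real-world space.
    ct_fiducials = np.array([[10.0,  0.0,  5.0], [40.0,  5.0,  8.0],
                             [25.0, 30.0,  2.0], [15.0, 20.0, 12.0]])
    real_fiducials = np.array([[112.1, 51.8, 33.2], [141.7, 58.0, 36.9],
                               [126.5, 82.4, 29.8], [117.2, 71.9, 40.1]])

    R, t = rigid_align(ct_fiducials, real_fiducials)
    virtual_fiducials = ct_fiducials @ R.T + t   # CT points placed in the AR scene
    print("per-axis RMSE (mm):", per_axis_rmse(real_fiducials, virtual_fiducials))

Reporting the error separately along x, y, and z, as in the quoted 3.22, 22.46, and 28.30 mm figures, exposes direction-dependent misalignment that a single aggregate distance would hide.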
ISSN: 2053-3713
DOI: 10.1049/htl.2018.5061