Augmented Reality for MR-guided surgery


Bibliographic Details
Main Author: Karlsen, Jørn Skaarud (Norges teknisk-naturvitenskapelige universitet, Institutt for datateknikk og informasjonsvitenskap)
Format: Web Resource
Language: eng; swe
Online Access: Order full text
Description
Summary: Intra-operative Magnetic Resonance Imaging is a new modality for image-guided therapy, and Augmented Reality (AR) is an important emerging technology in this field. AR enables the development of tools that can be applied both pre-operatively and intra-operatively, helping users to see into the body, look through organs, and visualize the parts relevant to a specific procedure. The work presented in this paper aims at solving several problems in order to develop an Augmented Reality system for real-life surgery in an MR environment.

Correctly registering 3D imagery with the real world is the central problem of both Augmented Reality and this thesis, and emphasis is put on the static registration problem. Its subproblems include: calibrating a video-see-through Head Mounted Display (HMD) entirely in Augmented Reality, registering a virtual object on a patient by placing a set of corresponding points on both the virtual object and the patient, and calculating the transformation needed for two overlapping tracking systems to deliver tracking signals in the same coordinate system. Additionally, problems and solutions related to the visualization of volume data and internal organs are presented: specifically, how to view virtual organs as if they resided inside the body of a patient through a cut, even though no surgical opening of the body has been performed, and how to visualize and manipulate a volume transfer function in a real-time Augmented Reality setting.

The implementations use the Studierstube and OpenTracker software frameworks for visualization and for abstraction of tracking devices, respectively. OpenCV, a computer vision library, is used for image processing and calibration, together with an implementation of Tsai's calibration method by Reg Willson. The Augmented Reality based calibration implementation uses two different calibration methods, referred to in the literature as Zhang and Tsai camera calibration, for calibrating the intrinsic and extrinsic camera parameters respectively. Registering virtual objects to real objects and aligning the overlapping tracking systems are both performed using a simplified version of the Iterative Closest Point (ICP) procedure, solving what is commonly referred to as the absolute orientation problem. The virtual-cut implementation works by projecting a rendered texture of a virtual organ and mapping it onto a mesh representation of a cut which is placed on the patient in Augmented Reality. The volume transfer functions are im
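
The calibration pipeline described above splits the problem into intrinsic parameters (Zhang's method) and extrinsic parameters (Tsai's method, via Reg Willson's implementation). As a hedged illustration only: modern OpenCV exposes a Zhang-style intrinsic calibration as cv::calibrateCamera, and a pose solver such as cv::solvePnP can stand in for the extrinsic step. The sketch below uses those calls with placeholder point data and an assumed camera resolution; it is not the thesis's actual code, which relied on Willson's Tsai implementation for the extrinsic parameters.

// Hedged sketch, not the thesis implementation: Zhang-style intrinsic
// calibration (cv::calibrateCamera is based on Zhang's method) followed by an
// extrinsic pose estimate from known 3D-2D correspondences. The thesis used
// Reg Willson's Tsai code for the extrinsic step; cv::solvePnP is only a
// stand-in here. Point data and image size are placeholders.
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <iostream>
#include <vector>

int main()
{
    // Intrinsics: corners of a planar target detected in several views
    // (e.g. with cv::findChessboardCorners); left empty in this sketch.
    std::vector<std::vector<cv::Point3f>> objectPoints; // corners in target frame
    std::vector<std::vector<cv::Point2f>> imagePoints;  // detected pixel positions
    cv::Size imageSize(640, 480);                       // assumed HMD camera size

    cv::Mat cameraMatrix, distCoeffs;
    std::vector<cv::Mat> rvecs, tvecs;
    if (!objectPoints.empty()) {
        double rms = cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                                         cameraMatrix, distCoeffs, rvecs, tvecs);
        std::cout << "intrinsic calibration RMS reprojection error: " << rms << "\n";
    }

    // Extrinsics: with the intrinsics fixed, one view of points whose 3D
    // positions are known in the tracker coordinate system gives the pose of
    // the HMD camera in that coordinate system.
    std::vector<cv::Point3f> trackerPoints3d; // e.g. tracked marker positions
    std::vector<cv::Point2f> trackerPoints2d; // their detections in the image
    cv::Mat rvec, tvec;                       // rotation (Rodrigues) and translation
    if (trackerPoints3d.size() >= 4 && !cameraMatrix.empty()) {
        cv::solvePnP(trackerPoints3d, trackerPoints2d,
                     cameraMatrix, distCoeffs, rvec, tvec);
    }
    return 0;
}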
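
Both point-based registration steps above (virtual object to patient, and tracking system to tracking system) reduce to the absolute orientation problem: finding the rigid transform that best maps one set of matched 3D points onto another. The thesis solves it with a simplified ICP; the closed-form SVD solution sketched below (Kabsch/Horn style) is one standard way to handle already-matched point pairs and is offered only as an illustration, using OpenCV matrix types and a toy point set.

// Hedged sketch of the closed-form absolute-orientation step: given matched
// point pairs (e.g. landmarks picked on the virtual object and on the patient,
// or points reported by both tracking systems), find the rigid transform with
// dst ≈ R*src + t. Not necessarily the solver used in the thesis.
#include <opencv2/core.hpp>
#include <iostream>
#include <vector>

static void absoluteOrientation(const std::vector<cv::Point3d>& src,
                                const std::vector<cv::Point3d>& dst,
                                cv::Mat& R, cv::Mat& t)
{
    CV_Assert(src.size() == dst.size() && src.size() >= 3);

    // Centroids of both point sets.
    cv::Point3d cs(0, 0, 0), cd(0, 0, 0);
    for (size_t i = 0; i < src.size(); ++i) { cs += src[i]; cd += dst[i]; }
    cs *= 1.0 / src.size();
    cd *= 1.0 / dst.size();

    // Cross-covariance of the demeaned correspondences.
    cv::Mat H = cv::Mat::zeros(3, 3, CV_64F);
    for (size_t i = 0; i < src.size(); ++i) {
        cv::Mat a = (cv::Mat_<double>(3, 1) << src[i].x - cs.x, src[i].y - cs.y, src[i].z - cs.z);
        cv::Mat b = (cv::Mat_<double>(3, 1) << dst[i].x - cd.x, dst[i].y - cd.y, dst[i].z - cd.z);
        H += a * b.t();
    }

    // Optimal rotation from the SVD of H, guarding against a reflection.
    cv::Mat w, U, Vt;
    cv::SVD::compute(H, w, U, Vt);
    R = Vt.t() * U.t();
    if (cv::determinant(R) < 0) {
        cv::Mat D = cv::Mat::eye(3, 3, CV_64F);
        D.at<double>(2, 2) = -1.0;
        R = Vt.t() * D * U.t();
    }

    // Translation maps the source centroid onto the destination centroid.
    cv::Mat csMat = (cv::Mat_<double>(3, 1) << cs.x, cs.y, cs.z);
    cv::Mat cdMat = (cv::Mat_<double>(3, 1) << cd.x, cd.y, cd.z);
    t = cdMat - R * csMat;
}

int main()
{
    // Toy example: points rotated 90 degrees about z and shifted by (2, 3, 4).
    std::vector<cv::Point3d> src = { {0,0,0}, {1,0,0}, {0,1,0}, {0,0,1} };
    std::vector<cv::Point3d> dst = { {2,3,4}, {2,4,4}, {1,3,4}, {2,3,5} };
    cv::Mat R, t;
    absoluteOrientation(src, dst, R, t);
    std::cout << "R =\n" << R << "\nt =\n" << t << std::endl;
    return 0;
}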
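
The abstract is cut off before it describes how the volume transfer functions were implemented. Purely as background, and not as the thesis's approach: a volume transfer function commonly maps normalized voxel intensity to colour and opacity through a lookup table that an interactive editor rebuilds whenever the user moves a control point. The names and structure below are hypothetical.

// Background sketch only, hypothetical names: a 1D piecewise-linear transfer
// function mapping normalized voxel intensity to RGBA. The resulting table
// would typically be re-uploaded as a 1D texture after each interactive edit.
#include <algorithm>
#include <array>
#include <vector>

struct Rgba { float r, g, b, a; };

struct ControlPoint {
    float intensity;   // normalized voxel value in [0, 1]
    Rgba  color;       // colour and opacity at that intensity
};

// Build a 256-entry lookup table by linear interpolation between control
// points (assumes at least one point with strictly increasing intensities).
std::array<Rgba, 256> buildLut(std::vector<ControlPoint> pts)
{
    std::sort(pts.begin(), pts.end(),
              [](const ControlPoint& a, const ControlPoint& b) {
                  return a.intensity < b.intensity;
              });
    std::array<Rgba, 256> lut{};
    for (int i = 0; i < 256; ++i) {
        float x = i / 255.0f;
        // Clamp outside the outermost control points.
        if (x <= pts.front().intensity) { lut[i] = pts.front().color; continue; }
        if (x >= pts.back().intensity)  { lut[i] = pts.back().color;  continue; }
        // Find the bracketing pair and interpolate.
        for (size_t k = 1; k < pts.size(); ++k) {
            if (x <= pts[k].intensity) {
                float s = (x - pts[k-1].intensity) /
                          (pts[k].intensity - pts[k-1].intensity);
                const Rgba& c0 = pts[k-1].color;
                const Rgba& c1 = pts[k].color;
                lut[i] = { c0.r + s*(c1.r - c0.r), c0.g + s*(c1.g - c0.g),
                           c0.b + s*(c1.b - c0.b), c0.a + s*(c1.a - c0.a) };
                break;
            }
        }
    }
    return lut;
}

int main()
{
    // Example: air transparent, soft tissue reddish and semi-transparent.
    std::vector<ControlPoint> pts = {
        { 0.0f, {0.0f, 0.0f, 0.0f, 0.0f} },
        { 0.4f, {0.8f, 0.2f, 0.2f, 0.1f} },
        { 1.0f, {1.0f, 1.0f, 0.9f, 0.9f} },
    };
    auto lut = buildLut(pts);
    (void)lut; // would be handed to the volume renderer as a 1D texture
    return 0;
}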