Surgical Instrument Tracking By Multiple Monocular Modules and a Sensor Fusion Approach

Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Automation Science and Engineering, April 2019, Vol. 16 (2), pp. 629-639
Main Authors: Wang, Jiaole; Song, Shuang; Ren, Hongliang; Lim, Chwee Ming; Meng, Max Q.-H.
Format: Article
Language: English
Description
Summary: This paper presents a sensor fusion-based surgical instrument tracking system that uses multiple monocular modules. The system is an optical tracking system, which has been widely used in image-guided surgery because of its high accuracy and precision. However, the line-of-sight occlusion problem, which remains unresolved in current systems, frustrates surgeons during the operation. To address this challenge, we propose a surgical instrument tracking system based on multiple monocular modules. The rationale is to enable the system to track the surgical instruments inside the surgical site from different views. Three sensor fusion algorithms are proposed to integrate all sensor data from the multimodule system. To demonstrate the feasibility of the tracking system, simulations and comparison experiments have been carried out. The results of this extensive investigation provide practical guidance for the real-world implementation of the proposed system in image-guided interventions. Moreover, an image-guided surgical trial using a cadaver head has been carried out to validate the feasibility of the proposed system and the tracking algorithms. The results from both the simulation and the cadaver trial show the effectiveness of the proposed robust fusion algorithm.
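
The abstract does not detail the paper's three fusion algorithms, but the core idea of combining measurements of the same instrument from several monocular views can be illustrated with a short sketch. The following Python snippet is an assumption-laden illustration, not the paper's method: it fuses per-module 3D position estimates by inverse-covariance weighting and discards modules whose residuals suggest an occluded line of sight. All function names, parameters, and thresholds are hypothetical.

```python
# Illustrative sketch only (not the algorithms from the paper): fuse 3D
# position estimates from several monocular tracking modules using
# inverse-covariance weighting, then drop modules whose residuals are
# implausibly large (e.g., a partially occluded view) and re-fuse.
import numpy as np

def fuse_positions(estimates, covariances, reject_thresh=3.0):
    """Fuse per-module 3D position estimates into a single estimate.

    estimates     : list of (3,) position estimates, one per module
    covariances   : list of (3, 3) measurement covariance matrices
    reject_thresh : Mahalanobis-distance cutoff for discarding a module
    """
    estimates = [np.asarray(e, dtype=float) for e in estimates]
    infos = [np.linalg.inv(np.asarray(c, dtype=float)) for c in covariances]

    # Initial fused estimate: information-weighted average over all modules.
    info_sum = sum(infos)
    fused = np.linalg.solve(info_sum, sum(W @ e for W, e in zip(infos, estimates)))

    # Robust step: keep only modules whose residual w.r.t. the fused
    # estimate is plausible, then re-fuse using the surviving modules.
    kept = [(e, W) for e, W in zip(estimates, infos)
            if np.sqrt((e - fused) @ W @ (e - fused)) <= reject_thresh]
    if kept:
        info_sum = sum(W for _, W in kept)
        fused = np.linalg.solve(info_sum, sum(W @ e for e, W in kept))
    return fused

# Example: three modules observe the same marker; the third view is badly
# off (e.g., occluded), so the residual check discards it and the fused
# result stays near the agreement of the first two, about (0.105, 0.195, 0.305).
p = fuse_positions(
    estimates=[[0.10, 0.20, 0.30], [0.11, 0.19, 0.31], [0.50, 0.90, 0.10]],
    covariances=[np.eye(3) * 1e-2] * 3,
)
```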
ISSN: 1545-5955, 1558-3783
DOI: 10.1109/TASE.2018.2848239