Gaze-contingent perceptually enabled interactions in the operating theatre
Saved in:
Published in: International Journal for Computer Assisted Radiology and Surgery, 2017-07, Vol. 12 (7), pp. 1131–1140
Main authors: , ,
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract:
Purpose
Improved surgical outcome and patient safety in the operating theatre are constant challenges. We hypothesise that a framework that collects and utilises information, especially perceptually enabled information, from multiple sources could help to meet these goals. This paper presents some core functionalities of a wider low-cost framework under development that allows perceptually enabled interaction within the surgical environment.
Methods
The synergy of wearable eye-tracking and advanced computer vision methodologies, such as SLAM, is exploited. As a demonstration of one of the framework's possible functionalities, an articulated collaborative robotic arm with a laser pointer is integrated, and the set-up is used to project the surgeon's fixation point in 3D space (an illustrative sketch of this projection pipeline follows the abstract).
Results
The implementation is evaluated over 60 fixations on predefined targets, with distances between the subject and the targets of 92–212 cm and between the robot and the targets of 42–193 cm. The median overall system error is currently 3.98 cm. Its real-time potential is also highlighted.
Conclusions
The work presented here represents an introduction and preliminary experimental validation of core functionalities of a larger framework under development. The proposed framework is geared towards a safer and more efficient surgical theatre.
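To make the Methods description concrete, below is a minimal Python sketch of the kind of computation such a set-up involves: a fixation point, assumed to have been triangulated in the SLAM/world frame from the wearable eye-tracker's gaze, is transformed into the robot base frame and converted to pan/tilt angles for a robot-mounted laser pointer. This is not the authors' implementation; all frame names, transforms, numerical values, and the pan/tilt laser model are illustrative assumptions.

```python
# Illustrative sketch only (not the paper's implementation): projecting a
# surgeon's 3D fixation point with a robot-mounted laser pointer.
# Frame names, transforms, and the pan/tilt laser model are assumptions.
import numpy as np

def to_homogeneous(p):
    """Append 1 to a 3D point so it can be multiplied by a 4x4 transform."""
    return np.append(p, 1.0)

def fixation_to_robot_frame(fixation_world, T_robot_from_world):
    """Map a fixation point from the SLAM/world frame into the robot base frame.

    fixation_world     : (3,) fixation point, assumed triangulated from the
                         wearable eye-tracker's gaze and the SLAM reconstruction.
    T_robot_from_world : (4, 4) homogeneous transform between the SLAM map and
                         the robot base, assumed known from a prior calibration.
    """
    return (T_robot_from_world @ to_homogeneous(fixation_world))[:3]

def laser_pan_tilt(target_robot, laser_origin_robot):
    """Compute pan/tilt angles that aim a simple pan-tilt laser at the target.

    Assumes the pan axis is vertical (z) and tilt is measured from the
    horizontal plane.
    """
    d = target_robot - laser_origin_robot
    pan = np.arctan2(d[1], d[0])                    # rotation about z
    tilt = np.arctan2(d[2], np.linalg.norm(d[:2]))  # elevation angle
    return pan, tilt

if __name__ == "__main__":
    # Example numbers are illustrative only.
    T_robot_from_world = np.eye(4)
    T_robot_from_world[:3, 3] = [-0.5, 0.2, 0.0]   # robot base offset (m)
    fixation_world = np.array([1.2, 0.4, 0.9])     # fixated point (m)
    laser_origin = np.array([0.0, 0.0, 0.6])       # laser on the arm (m)

    target = fixation_to_robot_frame(fixation_world, T_robot_from_world)
    pan, tilt = laser_pan_tilt(target, laser_origin)
    print(f"pan={np.degrees(pan):.1f} deg, tilt={np.degrees(tilt):.1f} deg")
```

In a real set-up of this kind, the world-to-robot transform would typically come from an extrinsic (hand-eye style) calibration between the SLAM map and the robot base, and the 3D fixation point from intersecting the gaze ray with the SLAM reconstruction of the scene.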
ISSN: 1861-6410, 1861-6429
DOI: 10.1007/s11548-017-1580-y