Surgical Tools Detection Based on Modulated Anchoring Network in Laparoscopic Videos


Detailed Description

Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, pp. 23748-23758
Main Authors: Zhang, Beibei; Wang, Shengsheng; Dong, Liyan; Chen, Peng
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Minimally invasive surgery, such as laparoscopic surgery, is an active area of clinical research because it causes less pain and allows faster recovery. Detecting surgical tools with accurate spatial locations in surgical videos not only helps ensure patient safety by reducing the incidence of complications but also supports assessment of surgeon performance. In this paper, we propose a novel Modulated Anchoring Network for detecting laparoscopic surgery tools based on Faster R-CNN, which inherits the merits of two-stage approaches while maintaining efficiency comparable to state-of-the-art one-stage methods. Since objects such as surgical instruments with wide aspect ratios are difficult to recognize, we develop a novel training scheme, named modulated anchoring, to explicitly predict arbitrary anchor shapes for objects of interest. To take the relationships among different tools into consideration, we embed a relation module in our network. We evaluate our method on an existing dataset (m2cai16-tool-locations) and a new private dataset (AJU-Set), both collected from cholecystectomy surgical videos in hospital and covering seven surgical tools with spatial bounds. Our detector yields detection accuracies of 69.6% and 76.5% on the two datasets, superior to other recently used architectures. We further verify the utility of our method by analyzing tool usage patterns, economy of movement, and dexterity of operations to assess surgical quality.
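The abstract's key idea is predicting anchor shapes rather than enumerating fixed aspect ratios, which matters for long, thin instruments. The paper's exact formulation is not given in this record; the following is a minimal sketch in the spirit of learned anchoring, where a shape head outputs per-location offsets `(dw, dh)` that are decoded into anchor width and height. All names, the base size, and the stride are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def predict_anchor_shapes(shape_logits, base_size=8.0, stride=16):
    """Decode predicted anchor shapes (illustrative sketch, not the paper's code).

    shape_logits: array of shape (H, W, 2) holding per-location (dw, dh)
    predicted by a small conv head over the feature map. The anchor size is
    recovered as base_size * stride * exp(d), so a single learned anchor per
    location can take an arbitrary aspect ratio (e.g. very wide graspers)
    instead of being limited to a fixed set of ratios.
    """
    dw, dh = shape_logits[..., 0], shape_logits[..., 1]
    w = base_size * stride * np.exp(dw)  # exp keeps sizes positive
    h = base_size * stride * np.exp(dh)
    return np.stack([w, h], axis=-1)    # (H, W, 2) anchor widths/heights
```

With zero logits every location decodes to a square 128x128 anchor; a positive `dw` with zero `dh` decodes to a wide, tool-like anchor, which is the behavior the modulated anchoring scheme is designed to learn.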
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.2969885