Assessment of surgical skills by using surgical navigation in robot-assisted partial nephrectomy
Published in: International journal for computer assisted radiology and surgery, 2019-08, Vol. 14 (8), p. 1449-1459
Format: Article
Language: English
Online access: Full text
Authors: Kobayashi, Satoshi; Cho, Byunghyun; Huaulmé, Arnaud; Tatsugami, Katsunori; Honda, Hiroshi; Jannin, Pierre; Hashizumea, Makoto; Eto, Masatoshi
Description:

Purpose: To assess surgical skills in robot-assisted partial nephrectomy (RAPN) with and without surgical navigation (SN).

Methods: We employed an SN system that synchronizes the real-time endoscopic image with a virtual reality three-dimensional (3D) model for RAPN and evaluated the skills of two expert surgeons with regard to the identification and dissection of the renal artery (non-SN group, n = 21 [first surgeon n = 9, second surgeon n = 12]; SN group, n = 32 [first surgeon n = 11, second surgeon n = 21]). We converted all movements of the robotic forceps during RAPN into a dedicated vocabulary. Using RAPN videos, we classified all movements of the robotic forceps into direct action (defined as movements of the robotic forceps that directly affect tissues) and connected motion (defined as movements that link actions). In addition, we analyzed the frequency, duration, and occupancy rate of the connected motions.

Results: In the SN group, the R.E.N.A.L nephrometry score was lower (7 vs. 6, P = 0.019) and the time to identify and dissect the renal artery was significantly shorter (16 vs. 9 min, P = 0.008). The inefficient "insert," "pull," and "rotate" connected motions were significantly improved by SN. SN significantly improved the frequency, duration, and occupancy rate of the connected motions of the right hand of the first surgeon and of both hands of the second surgeon. The improvements in connected motions were positively associated with SN for both surgeons.

Conclusion: This is the first study to investigate SN for nephron-sparing surgery. SN with 3D models might help improve the connected motions of expert surgeons to ensure efficient RAPN.
DOI: 10.1007/s11548-019-01980-8
PMID: 31119486
Publisher: Springer International Publishing, Cham
ISSN: 1861-6410
EISSN: 1861-6429
Source: SpringerNature Journals
Subjects: Computer Imaging; Computer Science; Frequency analysis; Health Informatics; Imaging; Medical instruments; Medicine; Medicine & Public Health; Occupancy; Original Article; Pattern Recognition and Graphics; Radiology; Robotics; Robots; Skills; Surgeons; Surgery; Three dimensional models; Time synchronization; Virtual reality; Vision