System and method for identifying and navigating anatomical objects using deep learning networks

A method for scanning, identifying, and navigating at least one anatomical object of a patient via an imaging system includes scanning the anatomical object via a probe of the imaging system, identifying the anatomical object via the probe, and navigating the anatomical object via the probe. Further, the method includes collecting data relating to operation of the probe during the scanning, identifying, and navigating steps. Moreover, the method includes inputting the collected data into a deep learning network configured to learn the scanning, identifying, and navigating steps. In addition, the method includes generating a probe visualization guide for an operator based on the deep learning network. The method also includes displaying the probe visualization guide to the operator via a user display of the imaging system, wherein the probe visualization guide instructs the operator how to maneuver the probe so as to locate the anatomical object. In addition, the method includes using haptic feedback in the probe to guide the operator to the anatomical object of the patient.
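The abstract describes a learn-from-operation loop: telemetry is recorded while the probe is operated, a deep learning network learns from that data, and the system then emits operator guidance. The following is a highly simplified, hypothetical sketch of that loop; the patent does not specify any code, and the `GuidanceModel` here is a trivial nearest-neighbour stand-in for the deep learning network, with made-up coordinate and movement conventions.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ProbeSample:
    """One telemetry sample recorded while the probe is operated."""
    position: Tuple[float, float, float]  # probe position (hypothetical coordinates)
    movement: Tuple[float, float, float]  # movement the operator made next

class GuidanceModel:
    """Trivial stand-in for the deep learning network: suggests the recorded
    movement of the nearest training sample (1-nearest-neighbour)."""
    def __init__(self) -> None:
        self.samples: List[ProbeSample] = []

    def learn(self, samples: List[ProbeSample]) -> None:
        self.samples.extend(samples)

    def suggest(self, position: Tuple[float, float, float]) -> Tuple[float, float, float]:
        def sq_dist(s: ProbeSample) -> float:
            return sum((a - b) ** 2 for a, b in zip(s.position, position))
        return min(self.samples, key=sq_dist).movement

def render_guide(movement: Tuple[float, float, float]) -> str:
    """Turn a suggested movement vector into operator-facing text, standing in
    for the patent's 'probe visualization guide'."""
    dx, dy, dz = movement
    direction = "right" if dx > 0 else "left"
    return f"Move probe {direction} ({dx:+.1f}, {dy:+.1f}, {dz:+.1f})"

model = GuidanceModel()
model.learn([
    ProbeSample((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)),
    ProbeSample((5.0, 0.0, 0.0), (0.0, 1.0, 0.0)),
])
print(render_guide(model.suggest((0.5, 0.0, 0.0))))
```

In the claimed system the nearest-neighbour lookup would be replaced by a trained deep network, and the text guide by an on-screen visualization plus haptic feedback in the probe.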

Bibliographic details
Authors: Avendi, Michael R; Duffy, Shane A
Format: Patent
Language: English
Publication date: 2019-12-12
Online access: order full text
Patent number: AU2018279877A1
Source: esp@cenet
Subjects: CALCULATING; COMPUTING; COUNTING; DIAGNOSIS; HUMAN NECESSITIES; HYGIENE; IDENTIFICATION; MEDICAL OR VETERINARY SCIENCE; PHYSICS; SURGERY