DeepNav: Joint View Learning for Direct Optimal Path Perception in Cochlear Surgical Platform Navigation



Bibliographic details
Published in: IEEE Access, 2023, Vol. 11, pp. 120593-120602
Main authors: Zamani, Majid; Demosthenous, Andreas
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page 120602
container_issue
container_start_page 120593
container_title IEEE access
container_volume 11
creator Zamani, Majid; Demosthenous, Andreas
description Although much research has been conducted in the field of automated cochlear implant navigation, the problem remains challenging. Deep learning techniques have recently achieved impressive results in a variety of computer vision problems, raising expectations that they might be applied in other domains, such as identifying the optimal navigation zone (OPZ) in the cochlea. In this paper, a 2.5D joint-view convolutional neural network (2.5D CNN) is proposed and evaluated for the identification of the OPZ in the cochlear segments. The proposed network consists of two complementary sagittal and bird-view (or top-view) networks for 3D OPZ recognition, each using a ResNet-8 architecture of five convolutional layers with rectified linear unit (ReLU) activations, followed by average pooling with a size equal to that of the final feature maps. The last fully connected layer of each network has four outputs, one per predicted indicator: the distances to the adjacent left and right walls, the collision probability, and the heading angle. The 2.5D CNN was trained using a parametric data generation model and then evaluated on anatomical cochlea models reconstructed from micro-CT images of different cases. Prediction of the indicators demonstrates the effectiveness of the 2.5D CNN; for example, the heading angle is predicted with less than 1° error, with a computation delay of less than 1 ms.
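
The abstract describes the network concretely enough to sketch in code. Below is a minimal, hypothetical PyTorch rendering of one possible reading: two five-layer convolutional branches with ReLU activations, global average pooling over the final feature maps, and a four-output fully connected head per branch. Channel widths, kernel sizes, the ResNet-8 skip connections (omitted here), and the fusion of the two views are assumptions not given in the record; this is not the authors' released code.

    import torch
    import torch.nn as nn

    class ViewBranch(nn.Module):
        """One view network (sagittal or bird-view): five convolutional layers
        with ReLU activations, average pooling sized to the final feature maps
        (i.e. global average pooling), then a fully connected layer with four
        outputs. The residual connections of the paper's ResNet-8 backbone are
        omitted here for brevity."""
        def __init__(self, in_channels=1, width=32):
            super().__init__()
            layers, c = [], in_channels
            for _ in range(5):  # five conv layers, per the abstract
                layers += [nn.Conv2d(c, width, kernel_size=3, padding=1),
                           nn.ReLU(inplace=True)]
                c = width
            self.features = nn.Sequential(*layers)
            self.pool = nn.AdaptiveAvgPool2d(1)  # pool over the whole map
            # Four indicators: distance to the left wall, distance to the
            # right wall, collision probability, heading angle.
            self.head = nn.Linear(width, 4)

        def forward(self, x):
            return self.head(self.pool(self.features(x)).flatten(1))

    class JointView25D(nn.Module):
        """Two complementary view branches. How the paper fuses the branch
        outputs is not stated in the abstract; averaging is an assumption."""
        def __init__(self):
            super().__init__()
            self.sagittal = ViewBranch()
            self.bird_view = ViewBranch()

        def forward(self, sagittal_img, bird_view_img):
            return 0.5 * (self.sagittal(sagittal_img)
                          + self.bird_view(bird_view_img))

    model = JointView25D()
    # Two single-channel views of the same cochlear segment (size arbitrary).
    indicators = model(torch.randn(1, 1, 64, 64), torch.randn(1, 1, 64, 64))
    print(indicators.shape)  # torch.Size([1, 4])

A sigmoid on the collision-probability output and a regression loss on the other three indicators would be a natural training setup, but neither is specified in the record.
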
doi_str_mv 10.1109/ACCESS.2023.3320557
format Article
fulltext fulltext
identifier ISSN: 2169-3536
ispartof IEEE access, 2023, Vol.11, p.120593-120602
issn 2169-3536
2169-3536
language eng
recordid cdi_proquest_journals_2885652039
source IEEE Open Access Journals; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek (Electronic Journals Library) - freely accessible e-journals
subjects Artificial neural networks
Automated insertion
Cochlea
cochlear implant
Cochlear implants
Collision dynamics
Computed tomography
Computer vision
convolutional neural network
Convolutional neural networks
Deep learning
Ear
Electrodes
Feature maps
Indicators
low-cost navigation
Machine learning
Navigation
Real-time systems
robust centerline tracing
Surgery
Three-dimensional displays
Virtual reality
virtual surgery
title DeepNav: Joint View Learning for Direct Optimal Path Perception in Cochlear Surgical Platform Navigation