Gaze-contingent processing improves mobility, scene recognition and visual search in simulated head-steered prosthetic vision

Detailed Description

Bibliographic Details
Published in: Journal of neural engineering 2024-04, Vol.21 (2), p.26037
Main authors: de Ruyter van Steveninck, Jaap; Nipshagen, Mo; van Gerven, Marcel; Güçlü, Umut; Güçlüturk, Yağmur; van Wezel, Richard
Format: Article
Language: English
Subjects:
Online access: Full text
Description: The enabling technology of visual prosthetics for the blind is making rapid progress. However, there are still uncertainties regarding the functional outcomes, which can depend on many design choices in the development. In visual prostheses with a head-mounted camera, a particularly challenging question is how to deal with the gaze-locked visual percept associated with spatial updating conflicts in the brain. The current study investigates a recently proposed compensation strategy based on gaze-contingent image processing with eye-tracking. Gaze-contingent processing is expected to reinforce natural-like visual scanning and re-establish spatial updating based on eye movements. The beneficial effects remain to be investigated for daily life activities in complex visual environments. The current study evaluates the benefits of gaze-contingent processing versus gaze-locked and gaze-ignored simulations in the context of mobility, scene recognition and visual search, using a virtual reality simulated prosthetic vision paradigm with sighted subjects. Compared to gaze-locked vision, gaze-contingent processing was consistently found to improve the speed in all experimental tasks, as well as the subjective quality of vision. Similar or further improvements were found in a control condition that ignores gaze-dependent effects, a simulation that is unattainable in clinical reality. Our results suggest that gaze-locked vision and spatial updating conflicts can be debilitating for complex visually-guided activities of daily living such as mobility and orientation. Therefore, for prospective users of head-steered prostheses with an unimpaired oculomotor system, the inclusion of a compensatory eye-tracking system is strongly endorsed.
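The contrast between gaze-locked and gaze-contingent processing described in the abstract can be illustrated with a minimal sketch. This is not the authors' simulation code; it is a hypothetical toy model in which the head-mounted camera frame is sampled either at a fixed head-centered position (gaze-locked) or at the tracked eye position (gaze-contingent), and the sampled window is block-averaged into a coarse grid standing in for a low-resolution phosphene percept. All names (`sample_window`, `phosphene_percept`) and parameter values are illustrative assumptions.

```python
import numpy as np

def sample_window(frame, center, size):
    # Crop a square window of `size` pixels centered on `center`
    # (row, col), clamped so the window stays inside the frame.
    h, w = frame.shape
    half = size // 2
    r = min(max(center[0], half), h - half)
    c = min(max(center[1], half), w - half)
    return frame[r - half:r + half, c - half:c + half]

def phosphene_percept(window, grid=16):
    # Reduce the window to a coarse grid of brightness values by
    # block-averaging, mimicking a low-resolution prosthetic percept.
    block = window.shape[0] // grid
    trimmed = window[:grid * block, :grid * block]
    return trimmed.reshape(grid, block, grid, block).mean(axis=(1, 3))

# A toy 480x640 "camera frame" with a bright target off-center.
frame = np.zeros((480, 640))
frame[100:140, 500:540] = 1.0

gaze = (120, 520)         # the eye fixates the target
head_center = (240, 320)  # the head-mounted camera points straight ahead

# Gaze-locked: the sampled window is tied to head orientation and
# ignores eye position, so the fixated target falls outside the percept.
locked = phosphene_percept(sample_window(frame, head_center, 128))

# Gaze-contingent: the sampled window follows the measured gaze,
# so the fixated target appears in the percept.
contingent = phosphene_percept(sample_window(frame, gaze, 128))
```

In this toy setup `locked` contains no trace of the fixated target while `contingent` does, which is the spatial-updating conflict the compensation strategy is meant to resolve.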
DOI: 10.1088/1741-2552/ad357d
ISSN: 1741-2560, 1741-2552
EISSN: 1741-2552
Source: MEDLINE; Institute of Physics Journals
Subjects: Activities of Daily Living
artificial vision
blindness
Computer Simulation
cortical visual prosthetics
Eye Movements
eye tracking
Humans
mobility and orientation
neuroprosthetics
Prospective Studies
virtual reality simulation
Vision, Ocular