Directing a virtual agent based on eye behaviour of a user

In an extended reality (XR), virtual reality (VR), augmented reality (AR) or mixed reality (MR) environment, the actions of a virtual object are directed by a user based on the user's eye behaviour, with the environment displaying a user avatar that includes a representation of the user's eye(s).

Detailed description

Saved in:
Bibliographic details
Main authors: Dan Feng, Mu Qiao, Bo Morgan, Mark Drummond
Format: Patent
Language: eng
Subjects:
Online access: Order full text
creator Dan Feng
Mu Qiao
Bo Morgan
Mark Drummond
description In an extended reality (XR), virtual reality (VR), augmented reality (AR) or mixed reality (MR) environment, the actions of a virtual object are directed by a user based on the user's eye behaviour, with the environment displaying a user avatar that includes a representation of the user's eye(s). The virtual object is displayed (302) on a display, and is associated with a first viewing frustum containing (304) the user avatar which includes a visual representation of one or more eyes. Whilst displaying the virtual object, eye tracking data is obtained (306) which is indicative of the user's eye behaviour, the visual representation of the eye(s) is updated (312) based on the eye behaviour, and the virtual object is directed (316) to perform an action based on the updated eye(s) representation as well as scene information associated with the device. The eye behaviour may include a movement (308) of the eye of the user from a first focus position to a second focus position, and this movement may include a saccade (310) which may be directed to an object of interest.
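The flow the abstract numbers (302–316) — display the agent, obtain eye tracking data, detect a saccade between focus positions, update the avatar's eye representation, and direct the agent using scene information — can be sketched as follows. This is a minimal illustration only, not the patented implementation; every name, threshold, and unit here is an assumption.

```python
# Hypothetical sketch of the abstract's numbered steps; all identifiers
# and the saccade threshold are illustrative assumptions.
from dataclasses import dataclass
import math

SACCADE_JUMP_THRESHOLD = 0.5  # assumed: normalized gaze distance per frame


@dataclass
class GazeSample:
    """One eye-tracking sample (306): a focus position in normalized coords."""
    x: float
    y: float


def is_saccade(prev: GazeSample, curr: GazeSample) -> bool:
    """Classify a movement (308) from one focus position to another as a
    saccade (310) when the per-frame jump exceeds a threshold."""
    return math.hypot(curr.x - prev.x, curr.y - prev.y) > SACCADE_JUMP_THRESHOLD


@dataclass
class Avatar:
    """User avatar containing a visual representation of the eye(s) (304)."""
    eye_x: float = 0.0
    eye_y: float = 0.0

    def update_eyes(self, sample: GazeSample) -> None:
        """Update the visual representation of the eye(s) (312)."""
        self.eye_x, self.eye_y = sample.x, sample.y


def direct_agent(avatar: Avatar, scene_objects: dict) -> str:
    """Direct the virtual object to perform an action (316) based on the
    updated eye representation plus scene information: here, simply attend
    to the scene object nearest the gaze direction."""
    return min(
        scene_objects,
        key=lambda name: math.hypot(
            scene_objects[name][0] - avatar.eye_x,
            scene_objects[name][1] - avatar.eye_y,
        ),
    )
```

For example, a large horizontal jump in gaze would register as a saccade, the avatar's eyes would snap to the new focus position, and the agent would be directed toward whichever scene object lies closest to that position.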
format Patent
language eng
recordid cdi_epo_espacenet_GB2609308A
source esp@cenet
subjects CALCULATING
COMPUTING
COUNTING
ELECTRIC DIGITAL DATA PROCESSING
PHYSICS
title Directing a virtual agent based on eye behaviour of a user