METHODS AND SYSTEMS OF EXTENDED REALITY ENVIRONMENT INTERACTION BASED ON EYE MOTIONS

Systems and methods are described for extended reality environment interaction. An extended reality environment including an object is generated for display, and a first sensor is used to detect that a gaze has shifted from a first portion of the extended reality environment to a second portion of the extended reality environment, where the object is excluded from the first portion of the extended reality environment and included in the second portion of the extended reality environment. An indicator of the shift in the gaze is generated for display within the extended reality environment in response to detecting the gaze shift, and a voice command is detected by a second sensor while the indicator is in a vicinity of the object. In response to detecting the voice command, an action corresponding to the voice command may be executed.
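The interaction loop the abstract describes can be sketched in a few lines. This is a minimal, hypothetical illustration — `XRObject`, `XRInteraction`, and the region model are illustrative stand-ins, not any real XR API: the first sensor's gaze-shift event moves an indicator, and the second sensor's voice command is acted on only while the indicator is in the vicinity of the object.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the claimed interaction flow; names and the
# "region" abstraction are assumptions, not part of any real library.

@dataclass
class XRObject:
    name: str
    region: str  # which portion of the environment contains the object

class XRInteraction:
    def __init__(self, target: XRObject):
        self.target = target
        self.indicator_region: Optional[str] = None  # where the gaze indicator is drawn

    def on_gaze_shift(self, old_region: str, new_region: str) -> None:
        # First sensor: gaze moves from one portion of the environment
        # to another, so render the indicator at the new gaze position.
        if old_region != new_region:
            self.indicator_region = new_region

    def on_voice_command(self, command: str) -> Optional[str]:
        # Second sensor: execute the command only while the indicator
        # is in the vicinity of the target object.
        if self.indicator_region == self.target.region:
            return f"{command} -> {self.target.name}"
        return None

xr = XRInteraction(XRObject("lamp", region="B"))
xr.on_gaze_shift("A", "B")          # gaze shifts into the portion holding the object
print(xr.on_voice_command("turn on"))  # command executes against the gazed-at object
```

Gating the voice command on the indicator's position is what lets the same utterance ("turn on") resolve to different objects depending on where the user is looking.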

Bibliographic details
Authors: PULIKUNTA, Sai Durga Venkat Reddy; T V, Arun Kumar; SAITO, Sakura; ROBERT JOSE, Jeffry Copps; AHER, Ankur Anil; SEN, Susanto; BALAJI, R
Format: Patent
Language: English (eng)
Publication date: 2024-10-03
Full record: https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20241003&DB=EPODOC&CC=AU&NR=2020473277A9
Record ID: cdi_epo_espacenet_AU2020473277A9
Source: esp@cenet
Subjects: CALCULATING; COMPUTING; COUNTING; ELECTRIC DIGITAL DATA PROCESSING; PHYSICS