Gaze-based object placement within a virtual reality environment
A head mounted display (HMD) device operating in a real world physical environment is configured with a sensor package that enables determination of an intersection of a device user's projected gaze with a location in a virtual reality environment so that virtual objects can be placed into the environment with high precision.
Main authors: Sugden, Ben; Salter, Tom; Burns, Aaron; Massey, Laura
Format: Patent
Language: eng
Online access: Order full text
creator | Sugden, Ben; Salter, Tom; Burns, Aaron; Massey, Laura |
description | A head mounted display (HMD) device operating in a real world physical environment is configured with a sensor package that enables determination of an intersection of a device user's projected gaze with a location in a virtual reality environment so that virtual objects can be placed into the environment with high precision. Surface reconstruction of the physical environment can be applied using data from the sensor package to determine the user's view position in the virtual world. A gaze ray originating from the view position is projected outward and a cursor or similar indicator is rendered on the HMD display at the ray's closest intersection with the virtual world such as a virtual object, floor/ground, etc. In response to user input, such as a gesture, voice interaction, or control manipulation, a virtual object is placed at the point of intersection between the projected gaze ray and the virtual reality environment. |
format | Patent |
creationdate | 2019-09-17 |
linktohtml | https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20190917&DB=EPODOC&CC=US&NR=10416760B2 |
fulltext | fulltext_linktorsrc |
language | eng |
recordid | cdi_epo_espacenet_US10416760B2 |
source | esp@cenet |
subjects | CALCULATING; COMPUTING; COUNTING; ELECTRIC COMMUNICATION TECHNIQUE; ELECTRIC DIGITAL DATA PROCESSING; ELECTRICITY; IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS; OPTICS; PHYSICS; PICTORIAL COMMUNICATION, e.g. TELEVISION |
title | Gaze-based object placement within a virtual reality environment |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-21T05%3A26%3A51IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=Sugden,%20Ben&rft.date=2019-09-17&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3EUS10416760B2%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |
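The description field above outlines the placement flow: project a gaze ray outward from the user's view position, find its closest intersection with the virtual world (a virtual object, the floor/ground, etc.), render a cursor at that point, and place a virtual object there in response to user input. The following is a minimal sketch of that flow, assuming a toy scene made of a horizontal ground plane and spheres; the helper names, scene representation, and use of NumPy are illustrative assumptions, not code from the patent.

```python
import numpy as np

# Illustrative sketch of gaze-ray placement against a toy scene
# (ground plane + spheres). Names and structure are assumptions.

def ray_plane_intersection(origin, direction, plane_y=0.0):
    """Positive distance along the ray to the horizontal plane y = plane_y, or None."""
    if abs(direction[1]) < 1e-9:
        return None  # ray is parallel to the ground plane
    t = (plane_y - origin[1]) / direction[1]
    return t if t > 0 else None

def ray_sphere_intersection(origin, direction, center, radius):
    """Nearest positive distance along a normalized ray to a sphere, or None."""
    oc = origin - center
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None  # ray misses the sphere
    t = -b - np.sqrt(disc)
    return t if t > 0 else None

def place_object(view_position, gaze_direction, spheres, plane_y=0.0):
    """Project a gaze ray from the view position and return its closest
    intersection with the scene. This is where the cursor would be rendered
    and, on user input (gesture, voice, control), where the object is placed."""
    direction = gaze_direction / np.linalg.norm(gaze_direction)
    hits = []
    t_plane = ray_plane_intersection(view_position, direction, plane_y)
    if t_plane is not None:
        hits.append(t_plane)
    for center, radius in spheres:
        t = ray_sphere_intersection(view_position, direction, center, radius)
        if t is not None:
            hits.append(t)
    if not hits:
        return None  # gaze ray leaves the scene; nothing to place against
    return view_position + min(hits) * direction

# Example: a user at head height gazes slightly downward; the nearest hit
# (a sphere directly ahead) is returned as the placement point.
view_pos = np.array([0.0, 1.7, 0.0])
gaze_dir = np.array([0.0, -0.3, 1.0])
scene_spheres = [(np.array([0.0, 0.5, 4.0]), 0.5)]
print(place_object(view_pos, gaze_dir, scene_spheres))
```

Taking the minimum positive intersection distance mirrors the behaviour described in the abstract: the cursor snaps to whatever the gaze ray reaches first, whether that is an existing virtual object or the ground.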