User interface for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments

The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
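The abstract's idea of tracking real objects via combined RGB and IR pixel data can be illustrated with a minimal, hypothetical sketch. This is not the patented method: the IR-thresholding segmentation, the centroid-based motion estimate, and all function names below are illustrative assumptions only.

```python
# Hypothetical sketch of RGB+IR motion tracking (not the patented method).
# A frame is a list of rows; each pixel is a tuple (r, g, b, ir).

def segment_object(frame, ir_threshold=128):
    """Crude foreground mask: (row, col) of pixels whose IR intensity
    exceeds the threshold (e.g. a warm hand near the sensor)."""
    return [(y, x)
            for y, row in enumerate(frame)
            for x, (r, g, b, ir) in enumerate(row)
            if ir > ir_threshold]

def centroid(pixels):
    """Average position of the masked pixels, or None if no pixels matched."""
    if not pixels:
        return None
    n = len(pixels)
    return (sum(y for y, _ in pixels) / n, sum(x for _, x in pixels) / n)

def track_motion(frames, ir_threshold=128):
    """Centroid displacement between consecutive frames: a stand-in for
    'tracking motion of one or more real objects' across a frame sequence."""
    centroids = [centroid(segment_object(f, ir_threshold)) for f in frames]
    return [(b[0] - a[0], b[1] - a[1])
            for a, b in zip(centroids, centroids[1:])
            if a is not None and b is not None]
```

For two 3x3 frames in which a single high-IR pixel moves from (0, 0) to (1, 1), `track_motion` reports a displacement of (1.0, 1.0) between the frames.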

Full description

Saved in:
Bibliographic details
Main authors: Edelhart, Dave, Yu, Wilbur Yung Sheng, Holz, David S, Fox, Barrett, Medich, Jody, Hare, Gabriel A, Plemmons, Daniel, Hay, Kyle A
Format: Patent
Language: eng
Subjects:
Online access: Order full text
recordid cdi_epo_espacenet_US11599237B2
source esp@cenet
subjects CALCULATING
COMPUTING
COUNTING
ELECTRIC DIGITAL DATA PROCESSING
IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
PHYSICS