AR Interactions and Experiences

In some implementations, the disclosed systems and methods can detect an interaction with respect to a set of virtual objects, which can start with a particular gesture, and take an action with respect to one or more virtual objects based on a further interaction (e.g., holding the gesture for a particular amount of time, moving the gesture in a particular direction, releasing the gesture, etc.). In some implementations, the disclosed systems and methods can automatically review a 3D video to determine a depicted user or avatar movement pattern (e.g., dance moves, a repair procedure, playing an instrument, etc.). In some implementations, the disclosed systems and methods can allow the gesture to include a flat hand with the user's thumb next to the palm, with the gesture directed toward the user's face.
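
To make the first of these implementations concrete, below is a minimal sketch of how an interaction that starts with a gesture and then resolves on a follow-up (hold, move, or release) might be modeled. Everything in it (class and field names, the hold threshold, and the Python structure) is a hypothetical illustration under assumptions, not code taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import List, Optional

class FollowUp(Enum):
    """Follow-up interactions that can complete a gesture-started interaction."""
    HOLD = auto()      # gesture held for at least a threshold duration
    MOVE = auto()      # gesture moved in a particular direction
    RELEASE = auto()   # gesture released

@dataclass
class GestureEvent:
    kind: str                        # e.g. "flat_hand_thumb_in" (hypothetical label)
    elapsed_s: float = 0.0           # how long the gesture has been held so far
    direction: Optional[str] = None  # e.g. "toward_face", "up"
    released: bool = False

class VirtualObjectInteraction:
    """Resolves which action to take on a set of virtual objects,
    given a starting gesture and a follow-up interaction."""

    HOLD_THRESHOLD_S = 1.0  # assumed value; the abstract only says "a particular amount of time"

    def __init__(self, selected_objects: List[str]):
        self.selected_objects = selected_objects

    def resolve(self, event: GestureEvent) -> Optional[FollowUp]:
        # Release takes priority, then directional movement, then a long enough hold.
        if event.released:
            return FollowUp.RELEASE
        if event.direction is not None:
            return FollowUp.MOVE
        if event.elapsed_s >= self.HOLD_THRESHOLD_S:
            return FollowUp.HOLD
        return None  # no qualifying follow-up yet

if __name__ == "__main__":
    # A flat-hand gesture held for 1.2 s over two (hypothetical) virtual objects.
    interaction = VirtualObjectInteraction(["panel_1", "panel_2"])
    event = GestureEvent(kind="flat_hand_thumb_in", elapsed_s=1.2)
    print(interaction.resolve(event))  # FollowUp.HOLD
```

A full system would also need the gesture recognizer itself (for example, classifying the flat-hand, thumb-against-palm pose the abstract mentions) and handlers that apply the chosen action to each selected virtual object; those pieces are omitted here.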

Bibliographic Details
Main authors: PLA I CONESA, Pol; CHOO, Amber; ROSAS, Daniel; GARCIA PUYOL, Ana; FAUCHER, Aaron; ASCHENBACH, Nathan; FUSTE LLEIXA, Anna; IBARS MARTINEZ, Roger; LEE, Hae Jin; MA, Jing
Format: Patent (US2024104870A1, published 2024-03-28)
Language: English
Subjects: CALCULATING; COMPUTING; COUNTING; ELECTRIC DIGITAL DATA PROCESSING; IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; PHYSICS
Online access: Order full text
Record ID: cdi_epo_espacenet_US2024104870A1
Source: esp@cenet