Motion Capture and Character Synthesis
In some examples, a computing device can determine synthetic meshes based on source meshes of a source mesh sequence and target meshes of a target mesh sequence. The computing device can then place the respective synthetic meshes based at least in part on a rigid transformation to define a processor-generated character.
Saved in:

| Main authors: | Hoppe Hugues H; Collet Romea Alvaro; Chuang Ming; Prada Nino Fabian Andres |
|---|---|
| Format: | Patent |
| Language: | eng |
| Subjects: | CALCULATING; COMPUTING; COUNTING; IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; PHYSICS |
| Online access: | Order full text |

| creator | Hoppe Hugues H; Collet Romea Alvaro; Chuang Ming; Prada Nino Fabian Andres |
|---|---|
| description | In some examples, a computing device can determine synthetic meshes based on source meshes of a source mesh sequence and target meshes of a target mesh sequence. The computing device can then place the respective synthetic meshes based at least in part on a rigid transformation to define a processor-generated character. For example, the computing device can determine subsets of the mesh sequences based on a similarity criterion. The computing device can determine modified first and second meshes having a connectivity corresponding to a reference mesh. The computing device can then determine the synthetic meshes based on the modified first and second meshes. In some examples, the computing device can project source and target textures onto the synthetic mesh to provide projected source and target textures. The computing device can determine a synthetic texture registered to the synthetic mesh based on the projected source and target textures. (See the illustrative sketch after this record.) |
| format | Patent |
| fulltext link | https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20180111&DB=EPODOC&CC=US&NR=2018012407A1 |
| fulltext | fulltext_linktorsrc |
| language | eng |
| recordid | cdi_epo_espacenet_US2018012407A1 |
| source | esp@cenet |
| subjects | CALCULATING; COMPUTING; COUNTING; IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; PHYSICS |
| title | Motion Capture and Character Synthesis |
| url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-25T16%3A56%3A30IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=Hoppe%20Hugues%20H&rft.date=2018-01-11&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3EUS2018012407A1%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |
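The abstract above describes blending meshes and textures from a source and a target sequence into synthetic transition frames. The sketch below is a minimal, hypothetical illustration of that general idea, not the patented method: it assumes the source and target meshes have already been re-meshed to a common reference connectivity (so vertices correspond one-to-one) and that both textures have already been projected onto the synthetic mesh's shared UV layout. All function and parameter names (`blend_meshes`, `synthesize_transition`, `n_frames`, and so on) are invented for this example.

```python
# Illustrative sketch only -- not the patented implementation.
# Assumptions: shared connectivity between blended meshes, shared UV layout
# between blended textures, and clips already selected as similar subsets.
import numpy as np

def blend_meshes(src_verts: np.ndarray, tgt_verts: np.ndarray, alpha: float) -> np.ndarray:
    """Linearly blend two vertex arrays of identical connectivity.

    Both arrays have shape [n_vertices, 3] and correspond vertex-for-vertex.
    """
    return (1.0 - alpha) * src_verts + alpha * tgt_verts

def blend_textures(src_tex: np.ndarray, tgt_tex: np.ndarray, alpha: float) -> np.ndarray:
    """Per-texel blend of two textures registered to the same UV layout."""
    blended = (1.0 - alpha) * src_tex.astype(np.float64) + alpha * tgt_tex.astype(np.float64)
    return blended.astype(src_tex.dtype)

def synthesize_transition(source_clip, target_clip, n_frames=10):
    """Blend the tail of `source_clip` into the head of `target_clip`.

    Each clip is a list of (vertices, texture) pairs; both clips must contain
    at least `n_frames` frames.
    """
    frames = []
    for i in range(n_frames):
        alpha = (i + 1) / (n_frames + 1)  # blend weight ramps from 0 toward 1
        src_v, src_t = source_clip[len(source_clip) - n_frames + i]
        tgt_v, tgt_t = target_clip[i]
        frames.append((blend_meshes(src_v, tgt_v, alpha),
                       blend_textures(src_t, tgt_t, alpha)))
    return frames
```

In the abstract's terms, placing each resulting synthetic mesh in the scene with a rigid transformation would be a separate step applied to the blended vertices after this blending stage.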