JOINT INFRARED AND VISIBLE LIGHT VISUAL-INERTIAL OBJECT TRACKING
In one embodiment, a method for tracking includes receiving motion data captured by a motion sensor of a wearable device, generating a pose of the wearable device based on the motion data, capturing a first frame of the wearable device by a camera using a first exposure time, identifying, in the first frame, a pattern of lights disposed on the wearable device, capturing a second frame of the wearable device by the camera using a second exposure time, identifying, in the second frame, predetermined features of the wearable device, and adjusting the pose of the wearable device in the environment based on the identified pattern of light in the first frame or the identified predetermined features in the second frame. The method utilizes the predetermined features for tracking the wearable device in a visible-light frame under specific light conditions to improve the accuracy of the pose of the controller.
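In plain terms, the abstract describes a loop that dead-reckons the wearable's pose from motion-sensor samples and then corrects it from one of two kinds of camera frames: a short-exposure frame in which the device's light pattern dominates, and a longer-exposure frame in which predetermined visible-light features of the device can be found when the light pattern alone is not enough. The Python sketch below illustrates that interleaving under loose assumptions; every identifier in it (Pose, imu_step, detect_leds, detect_features, correct_pose, track) is an illustrative stand-in, not an API from the patent or from any real tracking SDK, and the geometry is deliberately toy-grade.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Pose:
    position: np.ndarray  # 3-vector, metres
    yaw: float            # radians; full 3-DoF orientation omitted for brevity


def imu_step(pose: Pose, gyro_z: float, accel: np.ndarray, dt: float) -> Pose:
    """Propagate the pose from one motion-sensor sample (dead reckoning).
    A real tracker would also estimate velocity, gravity and IMU biases."""
    return Pose(position=pose.position + 0.5 * accel * dt ** 2,
                yaw=pose.yaw + gyro_z * dt)


def detect_leds(ir_frame: np.ndarray, threshold: int = 200) -> np.ndarray:
    """Find bright blobs in the short-exposure frame; with a short exposure,
    mostly the device's emitters survive a simple intensity threshold."""
    ys, xs = np.nonzero(ir_frame > threshold)
    return np.stack([xs, ys], axis=1) if xs.size else np.empty((0, 2))


def detect_features(vis_frame: np.ndarray) -> np.ndarray:
    """Stand-in for detecting predetermined visible-light features of the
    device (e.g. housing corners); here: strongest image gradients."""
    gy, gx = np.gradient(vis_frame.astype(float))
    mag = np.hypot(gx, gy)
    ys, xs = np.nonzero(mag > np.percentile(mag, 99.5))
    return np.stack([xs, ys], axis=1) if xs.size else np.empty((0, 2))


def correct_pose(pose: Pose, observations: np.ndarray, gain: float = 0.2) -> Pose:
    """Nudge the dead-reckoned pose toward an image-based estimate. A real
    system would solve for pose against the known LED/feature layout; this
    toy version only shows that either observation source can supply the fix."""
    if observations.shape[0] == 0:
        return pose  # nothing detected in this frame; keep the IMU prediction
    centroid_m = observations.mean(axis=0) / 1000.0  # pixels -> pseudo-metres
    position = pose.position.copy()
    position[:2] = (1.0 - gain) * position[:2] + gain * centroid_m
    return Pose(position=position, yaw=pose.yaw)


def track(pose: Pose, imu_samples, frames, dt: float = 0.005) -> Pose:
    """Interleave IMU propagation with alternating short-exposure (light
    pattern) and longer-exposure (visible-feature) frames, as in the abstract."""
    for (gyro_z, accel), (exposure, frame) in zip(imu_samples, frames):
        pose = imu_step(pose, gyro_z, accel, dt)
        if exposure == "short":        # first exposure time: pattern of lights
            pose = correct_pose(pose, detect_leds(frame))
        else:                          # second exposure time: visible features
            pose = correct_pose(pose, detect_features(frame))
    return pose
```

The exposure-dependent branch in track() mirrors the abstract's point that the predetermined visible-light features are used under specific light conditions to improve pose accuracy, for example when the light-pattern observations alone are unreliable; how the two correction sources are actually weighted or combined is not specified in the abstract and is left open here.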
Main authors: | Forster, Christian; Melim, Andrew |
---|---|
Format: | Patent |
Language: | eng |
Subjects: | |
Online access: | Order full text |
field | value |
---|---|
container_end_page | |
container_issue | |
container_start_page | |
container_title | |
container_volume | |
creator | Forster, Christian; Melim, Andrew |
description | In one embodiment, a method for tracking includes receiving motion data captured by a motion sensor of a wearable device, generating a pose of the wearable device based on the motion data, capturing a first frame of the wearable device by a camera using a first exposure time, identifying, in the first frame, a pattern of lights disposed on the wearable device, capturing a second frame of the wearable device by the camera using a second exposure time, identifying, in the second frame, predetermined features of the wearable device, and adjusting the pose of the wearable device in the environment based on the identified pattern of light in the first frame or the identified predetermined features in the second frame. The method utilizes the predetermined features for tracking the wearable device in a visible-light frame under specific light conditions to improve the accuracy of the pose of the controller. |
format | Patent |
fullrecord | (raw esp@cenet XML record omitted; US2024353920A1, dated 2024-10-24) |
fulltext | fulltext_linktorsrc |
identifier | |
ispartof | |
issn | |
language | eng |
recordid | cdi_epo_espacenet_US2024353920A1 |
source | esp@cenet |
subjects | CALCULATING; COMPUTING; COUNTING; DIAGNOSIS; ELECTRIC DIGITAL DATA PROCESSING; HUMAN NECESSITIES; HYGIENE; IDENTIFICATION; MEDICAL OR VETERINARY SCIENCE; PHYSICS; SURGERY |
title | JOINT INFRARED AND VISIBLE LIGHT VISUAL-INERTIAL OBJECT TRACKING |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-10T21%3A56%3A46IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=Forster,%20Christian&rft.date=2024-10-24&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3EUS2024353920A1%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |