Hybrid inertial and vision tracking for augmented reality registration
The biggest single obstacle to building effective augmented reality (AR) systems is the lack of accurate wide-area sensors for trackers that report the locations and orientations of objects in an environment. Active (sensor-emitter) tracking technologies require powered-device installation, limiting...
Saved in:
Main Authors: | Suya You; Neumann, U.; Azuma, R. |
Format: | Conference Proceeding |
Language: | English (eng) |
Subjects: | Acceleration; Augmented reality; Cameras; Computer vision; Interference; Machine vision; Magnetic sensors; Position measurement; Robustness; Sensor systems |
Online Access: | Order full text |
container_end_page | 267 |
container_issue | |
container_start_page | 260 |
container_title | |
container_volume | |
creator | Suya You ; Neumann, U. ; Azuma, R. |
description | The biggest single obstacle to building effective augmented reality (AR) systems is the lack of accurate wide-area sensors for trackers that report the locations and orientations of objects in an environment. Active (sensor-emitter) tracking technologies require powered-device installation, limiting their use to prepared areas that are relatively free of natural or man-made interference sources. Vision-based systems can use passive landmarks, but they are more computationally demanding and often exhibit erroneous behavior due to occlusion or numerical instability. Inertial sensors are completely passive, requiring no external devices or targets; however, the drift rates in portable strapdown configurations are too great for practical use. In this paper, we present a hybrid approach to AR tracking that integrates inertial and vision-based technologies. We exploit the complementary nature of the two technologies to compensate for the weaknesses in each component. Analysis and experimental results demonstrate this system's effectiveness. |
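The abstract describes fusing fast but drift-prone inertial rates with slower, drift-free vision corrections. As a rough illustration of that complementary idea (not the authors' actual algorithm), the sketch below runs a hypothetical 1-DOF complementary filter in which a vision-derived absolute angle, when available, pulls the gyro-integrated angle back toward a reference; the inputs and the blending factor `alpha` are assumptions invented for this example.

```python
import numpy as np

def complementary_fuse(gyro_rates, vision_angles, dt, alpha=0.9):
    """Minimal 1-DOF complementary filter (illustration only, not the paper's method).

    gyro_rates    : angular rates in rad/s, one per time step; fast but drift-prone
    vision_angles : absolute angle in rad, or None when no landmark is visible
    dt            : sample period in seconds
    alpha         : weight on the gyro-integrated angle vs. the vision measurement
    """
    angle = 0.0
    estimates = []
    for rate, vis in zip(gyro_rates, vision_angles):
        # Dead-reckon with the inertial sensor; on its own this accumulates drift.
        angle += rate * dt
        if vis is not None:
            # Blend in the vision measurement to pull the estimate back toward truth.
            angle = alpha * angle + (1.0 - alpha) * vis
        estimates.append(angle)
    return np.array(estimates)

# Synthetic example: constant true rotation, biased gyro, vision update every 10th frame.
dt, n = 0.01, 500
true_angles = 0.5 * dt * np.arange(n)          # ground-truth angle at each step
gyro = np.full(n, 0.5) + 0.05                  # 0.05 rad/s bias -> drift if used alone
vision = [a if i % 10 == 0 else None for i, a in enumerate(true_angles)]
gyro_only = np.cumsum(gyro) * dt
fused = complementary_fuse(gyro, vision, dt)
print("final error, gyro only:", gyro_only[-1] - true_angles[-1])
print("final error, fused    :", fused[-1] - true_angles[-1])
```

Running this shows the fused estimate's final error staying well below the gyro-only drift. The paper itself performs the fusion for full camera pose with real landmark tracking, but the drift-correction principle is the same.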
doi_str_mv | 10.1109/VR.1999.756960 |
format | Conference Proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1087-8270 |
ispartof | Proceedings IEEE Virtual Reality (Cat. No. 99CB36316), 1999, p.260-267 |
issn | 1087-8270 2375-5326 |
language | eng |
recordid | cdi_ieee_primary_756960 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | Acceleration ; Augmented reality ; Cameras ; Computer vision ; Interference ; Machine vision ; Magnetic sensors ; Position measurement ; Robustness ; Sensor systems |
title | Hybrid inertial and vision tracking for augmented reality registration |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-28T01%3A34%3A26IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_6IE&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=Hybrid%20inertial%20and%20vision%20tracking%20for%20augmented%20reality%20registration&rft.btitle=Proceedings%20IEEE%20Virtual%20Reality%20(Cat.%20No.%2099CB36316)&rft.au=Suya%20You&rft.date=1999&rft.spage=260&rft.epage=267&rft.pages=260-267&rft.issn=1087-8270&rft.eissn=2375-5326&rft.isbn=0769500935&rft.isbn_list=9780769500935&rft_id=info:doi/10.1109/VR.1999.756960&rft_dat=%3Cieee_6IE%3E756960%3C/ieee_6IE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=756960&rfr_iscdi=true |