Live tracking and mapping from both general and rotation-only camera motion
We present an approach to real-time tracking and mapping that supports any type of camera motion in 3D environments, that is, general (parallax-inducing) as well as rotation-only (degenerate) motions. Our approach effectively generalizes both a panorama mapping and tracking system and a keyframe-based Simultaneous Localization and Mapping (SLAM) system, behaving like one or the other depending on the camera movement. It seamlessly switches between the two and is thus able to track and map through arbitrary sequences of general and rotation-only camera movements. Key elements of our approach are to design each system component such that it is compatible with both panoramic data and Structure-from-Motion data, and the use of the `Geometric Robust Information Criterion' to decide whether the transformation between a given pair of frames can best be modeled with an essential matrix E, or with a homography H. Further key features are that no separate initialization step is needed, that the reconstruction is unbiased, and that the system continues to collect and map data after tracking failure, thus creating separate tracks which are later merged if they overlap. The latter is in contrast to most existing tracking and mapping systems, which suspend tracking and mapping, thus discarding valuable data, while trying to relocalize the camera with respect to the initial map. We tested our system on a variety of video sequences, successfully tracking through different camera motions and fully automatically building panoramas as well as 3D structures.
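The model-selection step mentioned in the abstract is based on Torr's Geometric Robust Information Criterion (GRIC). The sketch below illustrates, under stated assumptions, how such a criterion can be used to decide whether a frame pair is better explained by a homography H (rotation-only motion) or an essential matrix E (general motion); the function names, noise variance, and scoring constants are illustrative assumptions and are not taken from the paper itself.

```python
import math

def gric(residuals_sq, sigma_sq, n, d, k, r=4):
    """Torr-style GRIC score (lower is better) -- illustrative sketch.

    residuals_sq -- squared reprojection/Sampson errors for n correspondences
    sigma_sq     -- assumed measurement noise variance (pixels^2)
    d            -- dimension of the model manifold (2 for H, 3 for E)
    k            -- number of model parameters (8 for H, 5 for E)
    r            -- dimension of the data (4 for two-view point matches)
    """
    lam1, lam2 = math.log(r), math.log(r * n)
    # Robustly truncated residual term plus penalties for structure and model complexity.
    rho = sum(min(e / sigma_sq, 2.0 * (r - d)) for e in residuals_sq)
    return rho + lam1 * d * n + lam2 * k

def prefer_homography(res_h_sq, res_e_sq, sigma_sq=1.0):
    """Return True if the rotation-only model (H) scores better than the
    general-motion model (E) for the same set of correspondences."""
    n = len(res_h_sq)
    gric_h = gric(res_h_sq, sigma_sq, n, d=2, k=8)
    gric_e = gric(res_e_sq, sigma_sq, n, d=3, k=5)
    return gric_h < gric_e
```

In this sketch, a frame pair for which the homography wins would be treated as panoramic (rotation-only) data, while a pair for which the essential matrix wins would contribute parallax and hence Structure-from-Motion data, consistent with the switching behavior the abstract describes.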
Saved in:
Main Authors: | Gauglitz, S.; Sweeney, C.; Ventura, J.; Turk, M.; Hollerer, T. |
---|---|
Format: | Conference Proceeding |
Language: | eng |
Subjects: | Cameras; Data models; Merging; Real-time systems; Robustness; Simultaneous localization and mapping; Tracking |
Online Access: | Order full text |
container_end_page | 22 |
---|---|
container_issue | |
container_start_page | 13 |
container_title | 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) |
container_volume | |
creator | Gauglitz, S.; Sweeney, C.; Ventura, J.; Turk, M.; Hollerer, T. |
description | We present an approach to real-time tracking and mapping that supports any type of camera motion in 3D environments, that is, general (parallax-inducing) as well as rotation-only (degenerate) motions. Our approach effectively generalizes both a panorama mapping and tracking system and a keyframe-based Simultaneous Localization and Mapping (SLAM) system, behaving like one or the other depending on the camera movement. It seamlessly switches between the two and is thus able to track and map through arbitrary sequences of general and rotation-only camera movements. Key elements of our approach are to design each system component such that it is compatible with both panoramic data and Structure-from-Motion data, and the use of the `Geometric Robust Information Criterion' to decide whether the transformation between a given pair of frames can best be modeled with an essential matrix E, or with a homography H. Further key features are that no separate initialization step is needed, that the reconstruction is unbiased, and that the system continues to collect and map data after tracking failure, thus creating separate tracks which are later merged if they overlap. The latter is in contrast to most existing tracking and mapping systems, which suspend tracking and mapping, thus discarding valuable data, while trying to relocalize the camera with respect to the initial map. We tested our system on a variety of video sequences, successfully tracking through different camera motions and fully automatically building panoramas as well as 3D structures. |
doi_str_mv | 10.1109/ISMAR.2012.6402532 |
format | Conference Proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISBN: 9781467346603; ISBN: 1467346608; EISBN: 9781467346610; EISBN: 9781467346627; EISBN: 1467346616; EISBN: 1467346624 |
ispartof | 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2012, p.13-22 |
issn | |
language | eng |
recordid | cdi_ieee_primary_6402532 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | Cameras; Data models; Merging; Real-time systems; Robustness; Simultaneous localization and mapping; Tracking |
title | Live tracking and mapping from both general and rotation-only camera motion |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-25T20%3A46%3A47IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_6IE&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=Live%20tracking%20and%20mapping%20from%20both%20general%20and%20rotation-only%20camera%20motion&rft.btitle=2012%20IEEE%20International%20Symposium%20on%20Mixed%20and%20Augmented%20Reality%20(ISMAR)&rft.au=Gauglitz,%20S.&rft.date=2012-11&rft.spage=13&rft.epage=22&rft.pages=13-22&rft.isbn=9781467346603&rft.isbn_list=1467346608&rft_id=info:doi/10.1109/ISMAR.2012.6402532&rft_dat=%3Cieee_6IE%3E6402532%3C/ieee_6IE%3E%3Curl%3E%3C/url%3E&rft.eisbn=9781467346610&rft.eisbn_list=9781467346627&rft.eisbn_list=1467346616&rft.eisbn_list=1467346624&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=6402532&rfr_iscdi=true |