Handling Motion-Blur in 3D Tracking and Rendering for Augmented Reality

The contribution of this paper is two-fold. First, we show how to extend the ESM algorithm to handle motion blur in 3D object tracking. ESM is a powerful algorithm for template-matching-based tracking, but it can fail under motion blur. We introduce an image formation model that explicitly considers the possibility of blur, and show that it results in a generalization of the original ESM algorithm. This allows the tracker to converge faster, more accurately, and more robustly, even under large amounts of blur. Our second contribution is an efficient method for rendering virtual objects under the estimated motion blur. It renders two images of the object under 3D perspective and warps them to create many intermediate images. By fusing these images, we obtain a final image of the virtual objects that is blurred consistently with the captured image. Because warping is much faster than 3D rendering, we can create realistically blurred images at a very low computational cost.
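The first contribution rests on an image formation model in which the blurred frame is an average of sharp images warped along the camera trajectory during the exposure. The paper's exact notation is not part of this record, so the following is only a generic sketch of such a model, with the symbols chosen here for illustration:

\[
I_b(\mathbf{x}) \;=\; \frac{1}{T}\int_{0}^{T} I\bigl(\mathbf{w}(\mathbf{x};\,\mathbf{p}(t))\bigr)\,dt
\;\approx\; \frac{1}{K}\sum_{k=1}^{K} I\bigl(\mathbf{w}(\mathbf{x};\,\mathbf{p}_{k})\bigr),
\]

where I is the sharp template, w(.; p) the warp induced by the camera pose p, T the exposure time, and p_1, ..., p_K poses sampled between the start and the end of the exposure. Minimizing the difference between this blurred prediction and the captured frame with respect to the pose parameters is what turns the sharp-image objective that ESM normally minimizes into the generalization mentioned in the abstract.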

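The second contribution, the fast blurred rendering of the virtual content, can be pictured with the short sketch below. It follows the recipe given in the abstract (render the object at the two extreme poses of the exposure, warp those two renderings into many intermediate images, and average), but the concrete interpolation of the warp by blending homographies, the cross-fade weights, and all function and parameter names are assumptions made here for illustration, not the authors' implementation.

# Hypothetical sketch: approximate a motion-blurred rendering of a virtual
# object by warping and averaging two renderings made at the poses that
# bracket the exposure, instead of re-rendering the 3D model many times.
import numpy as np
import cv2  # used only for its homography warp


def blurred_overlay(render_start, render_end, H_start_to_end, n_steps=16):
    """render_start, render_end: float32 images (H x W x C) of the virtual
    object rendered under 3D perspective at the first and last pose of the
    exposure. H_start_to_end: 3x3 homography mapping pixels of the start
    rendering to the end rendering (assumed to come from the tracker)."""
    h, w = render_start.shape[:2]
    H = np.asarray(H_start_to_end, dtype=np.float64)
    H /= H[2, 2]
    H_inv = np.linalg.inv(H)
    eye = np.eye(3)
    acc = np.zeros(render_start.shape, dtype=np.float64)

    for k in range(n_steps):
        t = k / (n_steps - 1)          # 0 = start of exposure, 1 = end
        # Assumed interpolation: blend the identity and the full homography
        # to get the warp from the start image to the pose at time t.
        H_t = (1.0 - t) * eye + t * H
        # cv2.warpPerspective(src, M, dsize) maps src pixels x to M @ x.
        from_start = cv2.warpPerspective(render_start, H_t, (w, h))
        # Warp the end rendering to the same intermediate pose.
        from_end = cv2.warpPerspective(render_end, H_t @ H_inv, (w, h))
        # Cross-fade so each intermediate image leans on the closer rendering.
        acc += (1.0 - t) * from_start + t * from_end

    return (acc / n_steps).astype(np.float32)

Each of the n_steps intermediate images costs one homography warp of an already-rendered image, which is why a scheme of this kind is much cheaper than rendering the 3D model n_steps times; this is the source of the low computational cost the abstract claims.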

Bibliographic Details
Published in: IEEE Transactions on Visualization and Computer Graphics, 2012-09, Vol. 18 (9), p. 1449-1459
Main authors: Youngmin Park, Lepetit, V., Woontack Woo
Format: Article
Language: English
Subjects:
Online access: Order full text
container_end_page 1459
container_issue 9
container_start_page 1449
container_title IEEE transactions on visualization and computer graphics
container_volume 18
creator Youngmin Park
Lepetit, V.
Woontack Woo
description The contribution of this paper is two-fold. First, we show how to extend the ESM algorithm to handle motion blur in 3D object tracking. ESM is a powerful algorithm for template-matching-based tracking, but it can fail under motion blur. We introduce an image formation model that explicitly considers the possibility of blur, and show that it results in a generalization of the original ESM algorithm. This allows the tracker to converge faster, more accurately, and more robustly, even under large amounts of blur. Our second contribution is an efficient method for rendering virtual objects under the estimated motion blur. It renders two images of the object under 3D perspective and warps them to create many intermediate images. By fusing these images, we obtain a final image of the virtual objects that is blurred consistently with the captured image. Because warping is much faster than 3D rendering, we can create realistically blurred images at a very low computational cost.
doi_str_mv 10.1109/TVCG.2011.158
format Article
coden ITVGEA
eissn 1941-0506
pmid 21931174
publisher United States: IEEE
rights Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Sep 2012
fulltext fulltext_linktorsrc
identifier ISSN: 1077-2626
ispartof IEEE transactions on visualization and computer graphics, 2012-09, Vol.18 (9), p.1449-1459
issn 1077-2626
1941-0506
language eng
recordid cdi_proquest_miscellaneous_1038234143
source IEEE Electronic Library (IEL)
subjects Algorithms
Augmented reality
Blurred
Cameras
Computational efficiency
Computational modeling
computer vision
efficient second-order minimization
Jacobian matrices
motion-blur
object detection
object tracking
Rendering
Rendering (computer graphics)
Robustness
Studies
Three dimensional
Three dimensional displays
Tracking
title Handling Motion-Blur in 3D Tracking and Rendering for Augmented Reality
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-09T23%3A49%3A43IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Handling%20Motion-Blur%20in%203D%20Tracking%20and%20Rendering%20for%20Augmented%20Reality&rft.jtitle=IEEE%20transactions%20on%20visualization%20and%20computer%20graphics&rft.au=Youngmin%20Park&rft.date=2012-09-01&rft.volume=18&rft.issue=9&rft.spage=1449&rft.epage=1459&rft.pages=1449-1459&rft.issn=1077-2626&rft.eissn=1941-0506&rft.coden=ITVGEA&rft_id=info:doi/10.1109/TVCG.2011.158&rft_dat=%3Cproquest_RIE%3E2714515821%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1026851961&rft_id=info:pmid/21931174&rft_ieee_id=6025351&rfr_iscdi=true