Accurate Dynamic SLAM Using CRF-Based Long-Term Consistency

Accurate camera pose estimation is essential and challenging for real-world dynamic 3D reconstruction and augmented reality applications. In this article, we present a novel RGB-D SLAM approach for accurate camera pose tracking in dynamic environments. Previous methods detect dynamic components only...

Detailed Description

Saved in:
Bibliographic Details
Published in: IEEE transactions on visualization and computer graphics 2022-04, Vol.28 (4), p.1745-1757
Main authors: Du, Zheng-Jun, Huang, Shi-Sheng, Mu, Tai-Jiang, Zhao, Qunhe, Martin, Ralph R., Xu, Kun
Format: Article
Language: English
Subjects:
Online access: Order full text
container_end_page 1757
container_issue 4
container_start_page 1745
container_title IEEE transactions on visualization and computer graphics
container_volume 28
creator Du, Zheng-Jun
Huang, Shi-Sheng
Mu, Tai-Jiang
Zhao, Qunhe
Martin, Ralph R.
Xu, Kun
description Accurate camera pose estimation is essential and challenging for real-world dynamic 3D reconstruction and augmented reality applications. In this article, we present a novel RGB-D SLAM approach for accurate camera pose tracking in dynamic environments. Previous methods detect dynamic components only across a short time-span of consecutive frames. Instead, we provide a more accurate dynamic 3D landmark detection method, followed by the use of long-term consistency via conditional random fields, which leverages long-term observations from multiple frames. Specifically, we first introduce an efficient initial camera pose estimation method based on distinguishing dynamic from static points using graph-cut RANSAC. These static/dynamic labels are used as priors for the unary potential in the conditional random fields, which further improves the accuracy of dynamic 3D landmark detection. Evaluation using the TUM and Bonn RGB-D dynamic datasets shows that our approach significantly outperforms state-of-the-art methods, providing much more accurate camera trajectory estimation in a variety of highly dynamic environments. We also show that dynamic 3D reconstruction can benefit from the camera poses estimated by our RGB-D SLAM approach.
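The pipeline summarized in the abstract — robust per-frame static/dynamic labelling of 3D landmarks, followed by long-term aggregation of those labels over many frames — can be illustrated with a much simpler stand-in. The sketch below is not the authors' implementation: it replaces graph-cut RANSAC with an unspecified robust pose estimator that merely reports per-frame inlier masks, and it replaces the conditional random field with a per-landmark log-odds accumulator. All function names, parameters, and thresholds (frame_evidence, long_term_labels, p_inlier_given_static, and so on) are hypothetical, chosen only to show how evidence gathered over many frames can override a single frame's noisy label.

```python
# Minimal sketch (not the paper's method): long-term static/dynamic labelling
# of 3D landmarks from per-frame inlier masks, using a log-odds accumulator
# as a simplified stand-in for the CRF described in the abstract.
import numpy as np

def frame_evidence(inlier_mask, observed_mask,
                   p_inlier_given_static=0.9, p_inlier_given_dynamic=0.2):
    """Per-frame log-odds update (static vs. dynamic) for every landmark.

    inlier_mask[i]   -- True if landmark i passed the robust pose fit in this
                        frame (e.g. it was a RANSAC inlier).
    observed_mask[i] -- True if landmark i was observed at all in this frame;
                        unobserved landmarks contribute zero evidence.
    """
    llr_inlier = np.log(p_inlier_given_static / p_inlier_given_dynamic)
    llr_outlier = np.log((1.0 - p_inlier_given_static) /
                         (1.0 - p_inlier_given_dynamic))
    update = np.where(inlier_mask, llr_inlier, llr_outlier)
    return np.where(observed_mask, update, 0.0)

def long_term_labels(per_frame_inliers, per_frame_observed, threshold=0.0):
    """Sum evidence over many frames and threshold into static/dynamic.

    Both arguments are lists of boolean arrays (one per frame, each of length
    num_landmarks). Returns True where a landmark is judged static overall.
    """
    log_odds = np.zeros(len(per_frame_inliers[0]), dtype=float)
    for inliers, observed in zip(per_frame_inliers, per_frame_observed):
        log_odds += frame_evidence(np.asarray(inliers), np.asarray(observed))
    return log_odds > threshold

if __name__ == "__main__":
    # Landmark 2 is an outlier in two of three frames, so its accumulated
    # log-odds go negative and it is labelled dynamic; the rest stay static.
    inliers = [np.array([1, 1, 0, 1], dtype=bool),
               np.array([1, 1, 0, 1], dtype=bool),
               np.array([1, 1, 1, 1], dtype=bool)]
    observed = [np.ones(4, dtype=bool)] * 3
    print(long_term_labels(inliers, observed))  # [ True  True False  True]
```

In this simplified view, a landmark that is repeatedly rejected by the per-frame pose fit accumulates negative log-odds and is eventually treated as dynamic; the paper achieves the analogous long-term effect far more rigorously through its CRF formulation, with the per-frame labels serving as unary priors.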
doi_str_mv 10.1109/TVCG.2020.3028218
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1077-2626
ispartof IEEE transactions on visualization and computer graphics, 2022-04, Vol.28 (4), p.1745-1757
issn 1077-2626
1941-0506
language eng
recordid cdi_ieee_primary_9210559
source IEEE Electronic Library (IEL)
subjects Augmented reality
Cameras
Conditional random fields
Consistency
dynamic SLAM
Dynamics
graph-cut RANSAC
long-term consistency
Pose estimation
Reconstruction
RGB-D SLAM
Robustness
Simultaneous localization and mapping
Three-dimensional displays
Trajectory analysis
Visualization
title Accurate Dynamic SLAM Using CRF-Based Long-Term Consistency
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-14T20%3A43%3A15IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Accurate%20Dynamic%20SLAM%20Using%20CRF-Based%20Long-Term%20Consistency&rft.jtitle=IEEE%20transactions%20on%20visualization%20and%20computer%20graphics&rft.au=Du,%20Zheng-Jun&rft.date=2022-04-01&rft.volume=28&rft.issue=4&rft.spage=1745&rft.epage=1757&rft.pages=1745-1757&rft.issn=1077-2626&rft.eissn=1941-0506&rft.coden=ITVGEA&rft_id=info:doi/10.1109/TVCG.2020.3028218&rft_dat=%3Cproquest_RIE%3E2633042519%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2633042519&rft_id=info:pmid/33001804&rft_ieee_id=9210559&rfr_iscdi=true