Long-Term Visual Simultaneous Localization and Mapping: Using a Bayesian Persistence Filter-Based Global Map Prediction
With the rapidly growing demand for accurate localization in real-world environments, visual simultaneous localization and mapping (SLAM) has received significant attention in recent years. However, existing methods still suffer from degraded localization accuracy in long-term changing environments. To address this problem, we propose a novel long-term SLAM system with map prediction and dynamics removal. First, a visual point-cloud matching algorithm is designed to efficiently fuse 2D pixel information and 3D voxel information. Second, each map point is classified as static, semistatic, or dynamic based on the Bayesian persistence filter (BPF). We then remove the dynamic map points to eliminate their influence and obtain a global predicted map by modeling the time series of the semistatic map points. Finally, we incorporate the predicted global map into a state-of-the-art SLAM method, achieving an efficient visual SLAM system for long-term, dynamic environments. Extensive experiments carried out on a wheelchair robot in an indoor environment over several months demonstrate that our method achieves better map prediction accuracy and more robust localization performance.
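The abstract describes classifying each map point as static, semistatic, or dynamic from its posterior persistence probability under the BPF, but gives no implementation detail. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes a simplified persistence model with an exponential survival prior and fixed missed-detection and false-alarm rates, and all function names, rates, and classification thresholds (`vanish_rate`, `p_miss`, `p_false`, `static_thresh`, `dynamic_thresh`) are placeholders chosen for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class MapPointBelief:
    """Belief that a map point still physically exists in the environment."""
    p_exists: float = 0.99    # posterior P(point still present)
    last_update: float = 0.0  # timestamp of the last observation (seconds)

def predict(belief: MapPointBelief, t_now: float, vanish_rate: float = 1e-6) -> None:
    """Survival step: with an exponential survival prior (an assumption here),
    the probability that a point which existed at the last update still exists
    decays with the elapsed time since that update."""
    dt = max(t_now - belief.last_update, 0.0)
    belief.p_exists *= math.exp(-vanish_rate * dt)
    belief.last_update = t_now

def update(belief: MapPointBelief, detected: bool,
           p_miss: float = 0.2, p_false: float = 0.05) -> None:
    """Measurement step: Bayes update with fixed missed-detection and
    false-alarm rates (placeholder values)."""
    if detected:
        like_exists, like_gone = 1.0 - p_miss, p_false
    else:
        like_exists, like_gone = p_miss, 1.0 - p_false
    num = like_exists * belief.p_exists
    den = num + like_gone * (1.0 - belief.p_exists)
    belief.p_exists = num / den if den > 0.0 else belief.p_exists

def classify(belief: MapPointBelief,
             static_thresh: float = 0.9, dynamic_thresh: float = 0.3) -> str:
    """Map the posterior onto the three classes used in the paper's pipeline.
    Thresholds are illustrative only, not taken from the paper."""
    if belief.p_exists >= static_thresh:
        return "static"
    if belief.p_exists <= dynamic_thresh:
        return "dynamic"      # candidate for removal from the map
    return "semistatic"       # candidate for time-series map prediction

# Example: a point observed at t=0, missed at t=10 s, re-observed at t=20 s.
b = MapPointBelief()
for t, seen in [(0.0, True), (10.0, False), (20.0, True)]:
    predict(b, t)
    update(b, seen)
    print(f"t={t:5.1f}s  P(exists)={b.p_exists:.3f}  class={classify(b)}")
```

In the pipeline described in the abstract, points classified as dynamic would be dropped from the map, while the observation histories of semistatic points would feed the global map prediction step.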
Saved in:
Published in: | IEEE Robotics & Automation Magazine, 2023-03, Vol. 30 (1), p. 2-15 |
Main authors: | Deng, Tianchen; Xie, Hongle; Wang, Jingchuan; Chen, Weidong |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Band-pass filters; Bayesian analysis; Changing environments; Indoor environments; Localization; Location awareness; Optical filters; Robots; Simultaneous localization and mapping; Time series analysis; Visualization; Wheelchairs |
ISSN: | 1070-9932 (print); 1558-223X (electronic) |
DOI: | 10.1109/MRA.2022.3228492 |
Source: | IEEE Electronic Library (IEL) |
Online access: | Order full text |