Angular Tracking Consistency Guided Fast Feature Association for Visual-Inertial SLAM
Sparse-feature-based visual-inertial SLAM systems show great potential for accurate real-time pose estimation, especially on low-cost devices. However, feature-correspondence outliers inevitably degrade localization accuracy or cause failures. Unlike existing methods that eliminate...
Published in: | IEEE transactions on instrumentation and measurement 2024-01, Vol.73, p.1-1 |
---|---|
Main authors: | Xie, Hongle ; Deng, Tianchen ; Wang, Jingchuan ; Chen, Weidong |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
container_end_page | 1 |
---|---|
container_issue | |
container_start_page | 1 |
container_title | IEEE transactions on instrumentation and measurement |
container_volume | 73 |
creator | Xie, Hongle ; Deng, Tianchen ; Wang, Jingchuan ; Chen, Weidong |
description | Sparse-feature-based visual-inertial SLAM systems show great potential for accurate real-time pose estimation, especially on low-cost devices. However, feature-correspondence outliers inevitably degrade localization accuracy or cause failures. Unlike existing methods that eliminate outliers by fitting a geometric model, which have high complexity and rely on a model hypothesis, we present a general and efficient model-free scheme to address these challenges. In particular, we propose a novel uniform bipartite motion field (UBMF) to exactly measure the spatial transforms of sparse feature correspondences in consecutive frames. Moreover, a new recursive angular tracking consistency (RATC) guided fast feature association algorithm is designed, which efficiently selects correspondences and updates the UBMF simultaneously, while retaining linear computational complexity and a theoretical performance guarantee. Furthermore, we develop a lightweight angular-tracking-consistency-guided visual-inertial SLAM (ATVIS) system, which achieves better robustness and outperforms state-of-the-art methods. Extensive qualitative and quantitative validations on public benchmarks and in different real-world experiments demonstrate the superiority of our method in both localization accuracy and computational efficiency. |
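The core intuition behind angular-consistency-based outlier rejection can be illustrated with a toy sketch. This is not the authors' RATC or UBMF algorithm (whose details are not given in this record); it only shows the general idea that motion vectors of correctly matched features between consecutive frames tend to share a consistent direction, so matches whose angle deviates strongly from the dominant direction can be flagged as outliers. The function name and threshold below are hypothetical.

```python
import math


def angular_consistency_filter(matches, threshold_rad=0.35):
    """Keep feature correspondences whose motion direction agrees with
    the dominant motion direction across a frame pair.

    matches: list of ((x0, y0), (x1, y1)) pixel correspondences
             between consecutive frames.
    Returns the subset of matches judged directionally consistent.
    """
    # Motion-vector angle for each correspondence.
    angles = [math.atan2(y1 - y0, x1 - x0)
              for (x0, y0), (x1, y1) in matches]

    # Circular mean of the motion directions (robust to angle wrap-around).
    mean_angle = math.atan2(
        sum(math.sin(a) for a in angles),
        sum(math.cos(a) for a in angles),
    )

    def ang_diff(a, b):
        # Smallest absolute difference between two angles, in [0, pi].
        return abs((a - b + math.pi) % (2 * math.pi) - math.pi)

    return [m for m, a in zip(matches, angles)
            if ang_diff(a, mean_angle) <= threshold_rad]
```

Because each match is visited a constant number of times, this naive filter runs in linear time in the number of correspondences, mirroring the linear-complexity property the abstract claims for the RATC algorithm; the paper's actual method additionally updates its motion field recursively, which this sketch does not attempt.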
doi_str_mv | 10.1109/TIM.2023.3348902 |
format | Article |
fullrecord | IEEE article record (ieee_id 10379105 ; ProQuest record 2915723008). Publisher: IEEE, New York. CODEN: IEIMAO. Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024. Author ORCIDs: 0000-0002-4368-7936 ; 0000-0001-8757-0679 ; 0000-0001-8589-6798 ; 0000-0002-1943-1535. |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 0018-9456 |
ispartof | IEEE transactions on instrumentation and measurement, 2024-01, Vol.73, p.1-1 |
issn | 0018-9456 1557-9662 |
language | eng |
recordid | cdi_proquest_journals_2915723008 |
source | IEEE Electronic Library (IEL) |
subjects | Accuracy ; Algorithms ; Cameras ; Complexity ; Consistency ; feature association ; Inertial guidance ; Localization ; Location awareness ; outlier removal ; Outliers (statistics) ; Pose estimation ; Robot vision systems ; Robustness ; Simultaneous localization and mapping ; Solid modeling ; state estimation ; Tracking ; Visual-inertial SLAM |
title | Angular Tracking Consistency Guided Fast Feature Association for Visual-Inertial SLAM |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-13T03%3A43%3A59IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Angular%20Tracking%20Consistency%20Guided%20Fast%20Feature%20Association%20for%20Visual-Inertial%20SLAM&rft.jtitle=IEEE%20transactions%20on%20instrumentation%20and%20measurement&rft.au=Xie,%20Hongle&rft.date=2024-01-01&rft.volume=73&rft.spage=1&rft.epage=1&rft.pages=1-1&rft.issn=0018-9456&rft.eissn=1557-9662&rft.coden=IEIMAO&rft_id=info:doi/10.1109/TIM.2023.3348902&rft_dat=%3Cproquest_RIE%3E2915723008%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2915723008&rft_id=info:pmid/&rft_ieee_id=10379105&rfr_iscdi=true |