RGB-D Inertial Odometry for a Resource-Restricted Robot in Dynamic Environments

Current simultaneous localization and mapping (SLAM) algorithms perform well in static environments but easily fail in dynamic ones. Recent works introduce deep-learning-based semantic information into SLAM systems to reduce the influence of dynamic objects. However, it remains challenging to achieve robust localization in dynamic environments on resource-restricted robots.

Bibliographic Details
Published in: IEEE Robotics and Automation Letters, 2022-10, Vol. 7, No. 4, pp. 9573-9580
Authors: Liu, Jianheng; Li, Xuanfu; Liu, Yueqian; Chen, Haoyao
Format: Article
Language: English
Abstract: Current simultaneous localization and mapping (SLAM) algorithms perform well in static environments but easily fail in dynamic ones. Recent works introduce deep-learning-based semantic information into SLAM systems to reduce the influence of dynamic objects. However, it remains challenging to achieve robust localization in dynamic environments on resource-restricted robots. This paper proposes Dynamic-VINS, a real-time RGB-D inertial odometry system for resource-restricted robots in dynamic environments. Three main threads run in parallel: object detection, feature tracking, and state optimization. Dynamic-VINS combines object detection and depth information for dynamic feature recognition and achieves performance comparable to semantic segmentation. It adopts grid-based feature detection and proposes a fast, efficient method to extract high-quality FAST feature points. An IMU is used to predict motion for feature tracking and for a moving-consistency check. The proposed method is evaluated on both public datasets and real-world applications, and it shows competitive localization accuracy and robustness in dynamic environments. To the best of our knowledge, it is the best-performing real-time RGB-D inertial odometry system for resource-restricted platforms in dynamic environments to date. The system is open source at: https://github.com/HITSZ-NRSL/Dynamic-VINS.git
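The abstract mentions grid-based feature detection, which spreads features evenly across the image by keeping only the strongest corner response per grid cell. The following is a minimal sketch of that general idea, not the paper's implementation; the function name, interface, and grid parameters are illustrative assumptions.

```python
# Sketch of grid-based feature selection: partition the image into a
# grid and keep at most the single highest-scoring corner candidate
# (e.g. a FAST response) in each cell, so features spread evenly.

def select_grid_features(candidates, img_w, img_h, grid_cols, grid_rows):
    """candidates: list of (x, y, score) corner responses.
    Returns at most one feature per grid cell, sorted by score (desc)."""
    cell_w = img_w / grid_cols
    cell_h = img_h / grid_rows
    best = {}  # (col, row) -> (x, y, score), the strongest seen so far
    for x, y, score in candidates:
        col = min(int(x // cell_w), grid_cols - 1)
        row = min(int(y // cell_h), grid_rows - 1)
        key = (col, row)
        if key not in best or score > best[key][2]:
            best[key] = (x, y, score)
    return sorted(best.values(), key=lambda f: -f[2])

# Example: four candidates in a 2x2 grid over a 100x100 image; two fall
# into the same cell, so only the stronger one survives.
feats = select_grid_features(
    [(10, 10, 50), (12, 14, 80), (90, 90, 30), (60, 20, 40)],
    img_w=100, img_h=100, grid_cols=2, grid_rows=2)
print(feats)  # → [(12, 14, 80), (60, 20, 40), (90, 90, 30)]
```

Compared with taking the globally strongest corners, per-cell selection avoids clustering all features on one textured object, which helps when that object turns out to be dynamic.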
DOI: 10.1109/LRA.2022.3191193
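The abstract also mentions using IMU-predicted motion for a moving-consistency check. A common form of this idea: predict where each static feature should reappear, then flag features whose tracked position deviates too far as belonging to moving objects. The sketch below illustrates that general scheme under assumed inputs; the function name, interface, and threshold are illustrative, not from the paper.

```python
import math

# Sketch of an IMU-aided moving-consistency check: given the predicted
# pixel position of each feature (from IMU-propagated camera motion)
# and its optically tracked position, flag features whose deviation
# exceeds a pixel tolerance as likely dynamic.

def moving_consistency_check(predicted, tracked, pixel_tol=3.0):
    """predicted, tracked: parallel lists of (x, y) pixel positions.
    Returns a list of booleans: True = consistent (likely static)."""
    flags = []
    for (px, py), (tx, ty) in zip(predicted, tracked):
        err = math.hypot(px - tx, py - ty)  # Euclidean reprojection error
        flags.append(err <= pixel_tol)
    return flags

pred = [(100.0, 50.0), (200.0, 80.0)]
trk = [(101.0, 50.5), (215.0, 80.0)]  # second feature drifted 15 px away
print(moving_consistency_check(pred, trk))  # → [True, False]
```

Inconsistent features would then be excluded from state optimization so that moving objects do not corrupt the pose estimate.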
ISSN: 2377-3766
Source: IEEE Electronic Library (IEL)
Subjects: Algorithms
Dynamic scheduling
Feature detection
Feature extraction
Feature recognition
Localization
Machine learning
Object detection
Object recognition
Optimization
Real time
Robots
Semantic segmentation
Semantics
Simultaneous localization and mapping
Tracking
Vehicle dynamics
visual-inertial SLAM