Visual Slam in Dynamic Scenes Based on Object Tracking and Static Points Detection
Simultaneous Localization and Mapping (SLAM) plays a key role in tasks such as mobile robot navigation and path planning, so achieving high localization accuracy across varied scenarios is particularly important. This paper proposes a visual semantic SLAM algorithm based on object tracking and static point detection, in order to eliminate the influence of dynamic objects on localization and mapping.
Saved in:
Published in: | Journal of intelligent & robotic systems 2022-02, Vol.104 (2), Article 33 |
---|---|
Main authors: | Li, Gui-Hai; Chen, Song-Lin |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
container_issue | 2 |
container_title | Journal of intelligent & robotic systems |
container_volume | 104 |
creator | Li, Gui-Hai ; Chen, Song-Lin |
description | Simultaneous Localization and Mapping (SLAM) plays a key role in tasks such as mobile robot navigation and path planning. Achieving high localization accuracy across varied scenarios is particularly important. This paper proposes a visual semantic SLAM algorithm based on object tracking and static point detection, in order to eliminate the influence of dynamic objects on localization and mapping. The algorithm builds on the ORB-SLAM2 framework. For continuously acquired input images, a tracking algorithm is combined with object detection to achieve inter-frame association of objects in the scene. Then, epipolar geometry is used to detect static points on each object, and a depth constraint is introduced to improve robustness. After excluding dynamic objects, the static points are passed to the tracking thread to achieve more accurate localization. Finally, the poses of the dynamic objects are recorded for future autonomous robot navigation. Experiments on the public TUM and KITTI datasets show that, in dynamic scenes, the proposed algorithm reduces the absolute trajectory error (ATE) by more than 90% relative to ORB-SLAM2. The system also outperforms DynaSLAM and DS-SLAM in most cases, demonstrating that the proposed algorithm can effectively improve the localization accuracy of visual SLAM in dynamic scenes. |
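The static-point test described in the abstract combines an epipolar-geometry check with a depth constraint. A minimal sketch of that general idea is below; it is not the authors' implementation, and the threshold values, function names, and plain-list matrix representation are illustrative assumptions.

```python
import math

def epipolar_line(F, p1):
    # Epipolar line l = F @ p1 in image 2 induced by the homogeneous
    # pixel p1 = (u, v, 1) in image 1; F is a 3x3 fundamental matrix
    # given as nested lists.
    return [sum(F[i][j] * p1[j] for j in range(3)) for i in range(3)]

def epiline_distance(F, p1, p2):
    # Perpendicular distance (in pixels) of the matched point p2 from
    # the epipolar line (a, b, c) of p1: |a*u + b*v + c| / sqrt(a^2 + b^2).
    a, b, c = epipolar_line(F, p1)
    return abs(a * p2[0] + b * p2[1] + c * p2[2]) / math.hypot(a, b)

def is_static(F, p1, p2, depth_measured, depth_projected,
              epi_thresh=1.0, depth_thresh=0.2):
    # A matched point is kept as static only if it passes both tests:
    # it lies near the epipolar line of its match, and its measured
    # depth agrees with the depth projected from the previous frame.
    # Both thresholds here are placeholder values, not the paper's.
    if epiline_distance(F, p1, p2) > epi_thresh:
        return False  # violates epipolar geometry -> likely dynamic
    return abs(depth_measured - depth_projected) < depth_thresh
```

For a calibrated camera translating along the x-axis with no rotation, the essential matrix is the skew-symmetric form of the translation, so epipolar lines are horizontal: a static point keeps its vertical coordinate between frames, while a point that drifts off its epipolar line (or whose depth disagrees with the projected depth) is rejected as dynamic.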
doi_str_mv | 10.1007/s10846-021-01563-3 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0921-0296 |
ispartof | Journal of intelligent & robotic systems, 2022-02, Vol.104 (2), Article 33 |
issn | 0921-0296 1573-0409 |
language | eng |
recordid | cdi_proquest_journals_2626518445 |
source | Springer Nature - Complete Springer Journals |
subjects | Algorithms ; Artificial Intelligence ; Autonomous navigation ; Control ; Electrical Engineering ; Engineering ; Image acquisition ; Localization ; Mechanical Engineering ; Mechatronics ; Object recognition ; Robotics ; Robots ; Short Paper ; Simultaneous localization and mapping ; Tracking ; Trajectory planning |
title | Visual Slam in Dynamic Scenes Based on Object Tracking and Static Points Detection |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-17T13%3A46%3A16IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_proqu&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Visual%20Slam%20in%20Dynamic%20Scenes%20Based%20on%20Object%20Tracking%20and%20Static%20Points%20Detection&rft.jtitle=Journal%20of%20intelligent%20&%20robotic%20systems&rft.au=Li,%20Gui-Hai&rft.date=2022-02-01&rft.volume=104&rft.issue=2&rft.artnum=33&rft.issn=0921-0296&rft.eissn=1573-0409&rft_id=info:doi/10.1007/s10846-021-01563-3&rft_dat=%3Cgale_proqu%3EA729524032%3C/gale_proqu%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2626518445&rft_id=info:pmid/&rft_galeid=A729524032&rfr_iscdi=true |