LoLa-SLAM: Low-Latency LiDAR SLAM Using Continuous Scan Slicing
Real-time 6D pose estimation is a key component for autonomous indoor navigation of Unmanned Aerial Vehicles (UAVs). This letter presents a low-latency LiDAR SLAM framework based on LiDAR scan slicing and concurrent matching, called LoLa-SLAM. Our framework uses sliced point cloud data from a rotating LiDAR in a concurrent multi-threaded matching pipeline for 6D pose estimation with a high update rate and low latency. The LiDAR is actuated using a 2D Lissajous spinning pattern to overcome the sensor's limited FoV. We propose a two-dimensional roughness model to extract feature points for fine matching and registration of the point cloud. In addition, the pose estimator engages a temporal motion predictor that assists in finding feature correspondences in the map for fast convergence of the non-linear optimizer. Subsequently, an Extended Kalman Filter (EKF) is adopted for final pose fusion. The framework is evaluated in multiple experiments by comparing the accuracy, latency, and update rate of the pose estimation for trajectories flown in an indoor environment. We quantify the superior quality of the generated volumetric map in comparison to state-of-the-art frameworks. We further examine the localization precision using ground-truth pose information recorded by a total station unit.
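The scan-slicing and concurrent-matching idea summarized above can be sketched roughly as follows. Everything in this sketch is an illustrative assumption (the slice duration, the `match_slice` placeholder, the thread-pool dispatch), not the authors' implementation: timestamped points from the continuously rotating LiDAR are grouped into short time slices, and each slice is matched on its own worker thread instead of waiting for a full revolution.

```python
from concurrent.futures import ThreadPoolExecutor

def slice_scan(points, timestamps, slice_duration=0.05):
    """Group timestamped LiDAR points into fixed-duration slices."""
    slices, current, t0 = [], [], timestamps[0]
    for p, t in zip(points, timestamps):
        if t - t0 >= slice_duration:
            slices.append(current)
            current, t0 = [], t
        current.append(p)
    if current:
        slices.append(current)
    return slices

def match_slice(slice_points):
    # Placeholder for scan-to-map matching of one slice (e.g. feature
    # extraction plus registration against the current map); it returns a
    # dummy pose increment here.
    return (0.0, 0.0, 0.0)

def concurrent_matching(points, timestamps, slice_duration=0.05):
    """Dispatch each slice to a worker thread and collect pose increments."""
    slices = slice_scan(points, timestamps, slice_duration)
    with ThreadPoolExecutor() as pool:
        return list(pool.map(match_slice, slices))
```

Matching per slice rather than per full revolution is what lowers latency in this scheme: a pose update can be produced as soon as one slice's points have arrived.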
Saved in:
Published in: | IEEE robotics and automation letters 2021-04, Vol.6 (2), p.2248-2255 |
---|---|
Main authors: | Karimi, Mojtaba ; Oelsch, Martin ; Stengel, Oliver ; Babaians, Edwin ; Steinbach, Eckehard |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
creator | Karimi, Mojtaba ; Oelsch, Martin ; Stengel, Oliver ; Babaians, Edwin ; Steinbach, Eckehard |
description | Real-time 6D pose estimation is a key component for autonomous indoor navigation of Unmanned Aerial Vehicles (UAVs). This letter presents a low-latency LiDAR SLAM framework based on LiDAR scan slicing and concurrent matching, called LoLa-SLAM. Our framework uses sliced point cloud data from a rotating LiDAR in a concurrent multi-threaded matching pipeline for 6D pose estimation with high update rate and low latency. The LiDAR is actuated using a 2D Lissajous spinning pattern to overcome the sensor's limited FoV. We propose a two-dimensional roughness model to extract the feature points for fine matching and registration of the point cloud. In addition, the pose estimator engages a temporal motion predictor that assists in finding the feature correspondences in the map for the fast convergence of the non-linear optimizer. Subsequently, an Extended Kalman Filter (EKF) is adopted for final pose fusion. The framework is evaluated in multiple experiments by comparing the accuracy, latency, and the update rate of the pose estimation for the trajectories flown in an indoor environment. We quantify the superior quality of the generated volumetric map in comparison to the state-of-the-art frameworks. We further examine the localization precision using ground truth pose information recorded by a total station unit. |
doi_str_mv | 10.1109/LRA.2021.3060721 |
format | Article |
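The 2D Lissajous spinning pattern mentioned in the abstract drives the sensor mount along two sinusoidal axes whose frequencies differ, so the limited-FoV sensor sweeps the surroundings densely over time. A minimal sketch of such a pattern; the frequencies and amplitudes below are illustrative assumptions, not values from the letter:

```python
import math

def lissajous_angles(t, f_h=1.0, f_v=0.37,
                     amp_h=math.pi, amp_v=math.radians(45.0)):
    """Horizontal/vertical actuation angles (radians) at time t (seconds).

    Oscillating the two axes at different frequencies traces a Lissajous
    figure; a frequency ratio far from a small rational avoids retracing
    the same sparse path on every cycle.
    """
    horizontal = amp_h * math.sin(2.0 * math.pi * f_h * t)
    vertical = amp_v * math.sin(2.0 * math.pi * f_v * t)
    return horizontal, vertical
```

Sampling `lissajous_angles` at the LiDAR's firing rate yields the per-point mount orientation needed to project sliced scans into a common frame.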
identifier | ISSN: 2377-3766 |
issn | 2377-3766 (EISSN: 2377-3766) |
source | IEEE Electronic Library (IEL) |
subjects | aerial systems ; Autonomous navigation ; Extended Kalman filter ; Feature extraction ; Ground stations ; Ground truth ; Indoor environments ; Laser radar ; Lidar ; Location awareness ; low-latency localization ; Matching ; Measurement by laser beam ; perception and autonomy ; Pose estimation ; Real-time systems ; Simultaneous localization and mapping ; SLAM ; Slicing ; Three dimensional models ; Three-dimensional displays ; Trajectory analysis ; Two dimensional models ; Unmanned aerial vehicles |