Linear Four-Point LiDAR SLAM for Manhattan World Environments
Published in: IEEE Robotics and Automation Letters, Nov. 2023, Vol. 8, No. 11, pp. 7392-7399
Format: Article
Language: English
Abstract: We present a new SLAM algorithm that uses an inexpensive four-point LiDAR to compensate for the limited range and viewing angle of RGB-D cameras. The four-point LiDAR measures distances up to 40 m but returns only four distance measurements per scan. In open spaces, RGB-D SLAM approaches such as L-SLAM fail to estimate robust 6-DoF camera poses because of the limitations of the RGB-D camera. We detect walls beyond the range of RGB-D cameras using the four-point LiDAR and then build a reliable global Manhattan world (MW) map while simultaneously estimating 6-DoF camera poses. By leveraging the structural regularities of indoor MW environments, we overcome the challenge of SLAM with the sparse sensing provided by the four-point LiDAR. Using a linear Kalman filter (KF) framework, we extend the operating range of L-SLAM while preserving its strong performance even in low-textured environments. Our experiments in various indoor MW spaces, including open spaces, demonstrate that the performance of the proposed method is comparable to that of other state-of-the-art SLAM methods.
ISSN: 2377-3766
DOI: 10.1109/LRA.2023.3315205
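
The abstract's use of a linear KF can be illustrated with a minimal sketch. The snippet below is not the authors' formulation; it only shows, under assumed values (wall offset, measurement noise, measured range), how a single range measurement to an axis-aligned wall becomes a linear observation of one position component once the rotation has been aligned to the Manhattan frame, so a standard linear Kalman filter update applies.

```python
import numpy as np

# Minimal illustrative sketch (not the authors' exact formulation): under the
# Manhattan-world assumption, once rotation is resolved against the dominant
# axes, a range to an axis-aligned wall is a linear function of the camera
# position, so a standard linear KF update applies. Wall offset, range, and
# noise values below are hypothetical.

def kf_update(x, P, z, H, R):
    """Standard linear Kalman filter measurement update."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# State: camera position [px, py, pz]; prior after a prediction step.
x = np.array([1.0, 2.0, 1.5])
P = np.diag([0.2, 0.2, 0.1])

# One four-point LiDAR beam hits a wall aligned with the world y-z plane at
# x = 6.0 m (hypothetical map entry). With the sensor aligned to the Manhattan
# frame, the predicted range is (wall_x - px), i.e. linear in the state.
wall_x = 6.0
measured_range = 4.9                     # metres, hypothetical reading
z = np.array([wall_x - measured_range])  # implied camera x-coordinate
H = np.array([[1.0, 0.0, 0.0]])          # observes only the x component
R = np.array([[0.05**2]])                # assumed range noise variance

x, P = kf_update(x, P, z, H, R)
print("updated position:", x)
```

Each of the four range measurements per scan would contribute one such linear observation against whichever axis-aligned wall it hits, which is what makes sparse sensing workable under the MW assumption.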