Learning Observation Model for Factor Graph Based-State Estimation Using Intrinsic Sensors

Bibliographic Details
Published in: IEEE Transactions on Automation Science and Engineering, 2023-07, Vol. 20 (3), pp. 2049-2062
Main authors: Van Nam, Dinh; Gon-Woo, Kim
Format: Article
Language: English
Description
Abstract: Navigation for autonomous mobile robots is challenging when exteroceptive sensors such as cameras, LiDARs, and radars are used in textureless and structureless environments. This paper presents a robust state estimation system for holonomic mobile robots that relies on intrinsic sensors and adaptive factor graph optimization in such degraded scenarios. In particular, neural networks are employed to learn the observation and noise models using only IMU and wheel encoder data. We investigate this learning model for the holonomic mobile robot across various neural network architectures, and explore networks that are more accurate yet computationally cheaper for the inertial-wheel-encoder setting. Furthermore, we conduct the experiments on an industrial holonomic robot platform equipped with multiple LiDARs, cameras, an IMU, and wheel encoders, generating ground truth without a bulky motion-capture system. The collected datasets are then used to train the neural networks. Finally, the experimental evaluation shows that our solution achieves better accuracy and real-time performance than competing approaches.

Note to Practitioners: Autonomous mobile robots must operate robustly in challenging environments that deny exteroceptive sensors such as cameras, LiDARs, and radars. In such situations, the navigation system must rely on intrinsic sensors, namely the inertial sensor and the wheel encoders. Conventional methods combine these sensors through recursive Bayesian filtering without adapting the underlying models, while deep learning-based solutions have adopted large networks such as LSTMs or CNNs to handle the estimation problem. This work develops a state estimation subsystem for holonomic mobile robots that fuses the intrinsic sensors through adaptive factor graph optimization. In particular, we show how to efficiently combine the factor graph with a learned observation model for the IMU and wheel encoder factors. Neural networks are introduced to learn the observation model from IMU and wheel encoder inputs; we find that lightweight neural networks can outperform heavier deep learning techniques on this data. Finally, the networks are embedded in a factor graph to perform smoothing-based state estimation. The proposed system could operate with hig…
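The abstract describes a lightweight network that learns both an observation and a noise model from IMU and wheel encoder data, whose output is then consumed as a factor in graph-based smoothing. The following is a minimal sketch of that idea, not the authors' actual architecture: all names, dimensions, the sensor window size, and the velocity-observation parameterization are illustrative assumptions.

```python
# Sketch: a lightweight MLP maps a window of IMU + wheel-encoder readings to a
# body-frame velocity observation plus a learned per-axis log-variance (noise
# model). The whitened residual below is how such a learned measurement would
# enter a factor graph's nonlinear least-squares objective.
import torch
import torch.nn as nn

WINDOW = 20    # assumed number of stacked sensor samples per observation
IMU_DIM = 6    # 3-axis gyroscope + 3-axis accelerometer
ENC_DIM = 4    # assumed four wheel encoders on the holonomic platform

class LearnedObservationModel(nn.Module):
    """Lightweight MLP: sensor window -> (velocity observation, log-variance)."""
    def __init__(self):
        super().__init__()
        in_dim = WINDOW * (IMU_DIM + ENC_DIM)
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        self.obs_head = nn.Linear(64, 3)      # (vx, vy, yaw rate) in body frame
        self.logvar_head = nn.Linear(64, 3)   # learned noise model, per axis

    def forward(self, window):                # window: (batch, WINDOW, 10)
        h = self.backbone(window.flatten(1))
        return self.obs_head(h), self.logvar_head(h)

def gaussian_nll(obs, logvar, target_vel):
    """Loss that jointly trains the observation and noise heads against
    ground-truth velocity (e.g., from the LiDAR/camera reference setup)."""
    return 0.5 * (logvar + (target_vel - obs) ** 2 / torch.exp(logvar)).mean()

def velocity_factor_residual(state_vel, obs, logvar):
    """Whitened residual of one learned velocity factor: the optimizer
    minimizes ||r||^2, so the learned sigma scales the factor's influence."""
    sigma = torch.exp(0.5 * logvar)
    return (state_vel - obs) / sigma

# Toy usage: one sensor window, one current velocity estimate.
model = LearnedObservationModel()
window = torch.randn(1, WINDOW, IMU_DIM + ENC_DIM)
obs, logvar = model(window)
r = velocity_factor_residual(torch.zeros(1, 3), obs, logvar)
print(r.shape)  # torch.Size([1, 3])
```

In a full pipeline the (observation, covariance) pair would be wrapped as a custom factor in an incremental smoother; the learned log-variance is what makes the factor graph "adaptive", since noisy regimes automatically down-weight the corresponding measurements.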
ISSN: 1545-5955, 1558-3783
DOI: 10.1109/TASE.2022.3193411