Advancing robust state estimation of wheeled robots in degenerate environments: harnessing ground manifold and motion states
Published in: | Measurement Science & Technology, 2024-04, Vol. 35 (4), p. 46308 |
---|---|
Main authors: | , , , |
Format: | Article |
Language: | English |
Online access: | Full text |
Summary: | State estimation is crucial for enabling autonomous mobility in mobile robots. However, traditional localization methods often falter in degraded environments, with issues such as visual occlusion, lidar performance degradation, and global navigation satellite system signal interference. This paper presents a novel estimation approach for wheeled robots that relies exclusively on proprioceptive sensors, namely encoders and inertial measurement units (IMUs). First, motion manifolds extracted from historical trajectories assist the encoder in estimating orientation. Furthermore, a hybrid neural network is designed to categorize the robot’s operational state, and the corresponding pseudo-constraints are added to improve estimation accuracy. We use an error-state Kalman filter to fuse the encoder and IMU data. Lastly, comprehensive testing is conducted on both datasets and real-world robotic platforms. The findings show that integrating the manifold and motion constraints into the proposed state estimator substantially improves accuracy over conventional approaches: compared with methods commonly used in engineering, accuracy improves by more than 20%. Crucially, this methodology enables dependable estimation even in degraded environments. |
ISSN: | 0957-0233; 1361-6501 |
DOI: | 10.1088/1361-6501/ad1dad |
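
The abstract above describes encoder–IMU fusion through an error-state Kalman filter, with a motion-state classifier adding pseudo-constraints (for example, a zero-velocity constraint when the robot is judged to be stationary). The snippet below is a minimal sketch of that pattern only, not the authors' implementation: the `SimpleESKF` class, the 9-dimensional error state, the noise values, and the variance-threshold `classify_motion_state` stand-in for the paper's hybrid neural network are all assumptions made for illustration.

```python
import numpy as np

class SimpleESKF:
    """Toy error-state Kalman filter over [position, velocity, attitude] errors."""

    def __init__(self):
        self.dx = np.zeros(9)           # error-state estimate (reset after injection)
        self.P = np.eye(9) * 1e-3       # error covariance
        self.Q = np.eye(9) * 1e-4       # process noise (assumed value)

    def propagate(self, F):
        """Propagate covariance with an error-state transition matrix F (9x9)
        built from the IMU/encoder increments of the current step."""
        self.P = F @ self.P @ F.T + self.Q

    def update(self, H, z, R):
        """Generic Kalman measurement update on the error state."""
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.dx = self.dx + K @ (z - H @ self.dx)
        self.P = (np.eye(9) - K @ H) @ self.P
        return self.dx

    def zero_velocity_update(self, v_nominal, sigma=0.01):
        """Pseudo-measurement applied when the robot is classified as stationary:
        the true velocity is zero, so the velocity error equals -v_nominal."""
        H = np.zeros((3, 9))
        H[:, 3:6] = np.eye(3)           # observe the velocity-error block
        z = -np.asarray(v_nominal)      # measured velocity error
        R = np.eye(3) * sigma ** 2      # tight noise: a strong "stopped" constraint
        return self.update(H, z, R)


def classify_motion_state(imu_window):
    """Stand-in for the paper's hybrid neural network: a crude variance
    threshold on a window of IMU readings (assumption, for illustration)."""
    return "stationary" if np.var(imu_window) < 1e-4 else "moving"


# Example step: propagate, then apply the pseudo-constraint only when stationary.
eskf = SimpleESKF()
eskf.propagate(np.eye(9))               # identity F as a placeholder transition
imu_window = np.zeros((50, 6))          # fake "robot at rest" IMU buffer
if classify_motion_state(imu_window) == "stationary":
    correction = eskf.zero_velocity_update(v_nominal=np.array([0.02, -0.01, 0.0]))
    # correction[3:6] would then be injected back into the nominal velocity.
```

Expressing the motion-state constraint as a measurement update, rather than hard-resetting the velocity, lets the filter weigh the constraint against its current covariance, which is presumably what keeps the pseudo-constraint useful even when the classifier is only mostly right.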