Legged Robot State-Estimation Through Combined Forward Kinematic and Preintegrated Contact Factors
Saved in:
Main authors: , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: State-of-the-art robotic perception systems have achieved sufficiently good
performance using Inertial Measurement Units (IMUs), cameras, and nonlinear
optimization techniques, that they are now being deployed as technologies.
However, many of these methods rely significantly on vision and often fail when
visual tracking is lost due to lighting or scarcity of features. This paper
presents a state-estimation technique for legged robots that takes into account
the robot's kinematic model as well as its contact with the environment. We
introduce forward kinematic factors and preintegrated contact factors into a
factor graph framework that can be incrementally solved in real-time. The
forward kinematic factor relates the robot's base pose to a contact frame
through noisy encoder measurements. The preintegrated contact factor provides
odometry measurements of this contact frame while accounting for possible foot
slippage. Together, the two developed factors constrain the graph optimization
problem allowing the robot's trajectory to be estimated. The paper evaluates
the method using simulated and real sensory IMU and kinematic data from
experiments with a Cassie-series robot designed by Agility Robotics. These
preliminary experiments show that using the proposed method in addition to IMU
decreases drift and improves localization accuracy, suggesting that its use can
enable successful recovery from a loss of visual tracking.
DOI: 10.48550/arxiv.1712.05873
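The forward kinematic factor described in the abstract relates the robot's base pose to a contact frame through noisy encoder measurements. A minimal planar sketch of such a factor's residual is shown below; a 2-link leg stands in for Cassie's full kinematics, and all function names, link lengths, and the 2-D pose parameterization are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def fk_leg(q, l1=0.5, l2=0.5):
    """Planar 2-link forward kinematics: foot position in the base frame.
    Link lengths l1, l2 are illustrative, not Cassie's real geometry."""
    x = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    z = -l1 * np.cos(q[0]) - l2 * np.cos(q[0] + q[1])
    return np.array([x, z])

def fk_factor_residual(base, contact, q):
    """Forward-kinematic factor residual: the difference between the
    estimated contact-frame position and the foot position predicted
    from the base pose and the (noisy) encoder readings q.
    base = (x, z, yaw) in the world frame; contact = (x, z)."""
    x, z, th = base
    # Rotation of the base frame into the world frame.
    R = np.array([[np.cos(th), -np.sin(th)],
                  [np.sin(th),  np.cos(th)]])
    predicted = np.array([x, z]) + R @ fk_leg(q)
    return contact - predicted
```

In a full factor-graph formulation this residual (weighted by the encoder noise model) would constrain each base-pose variable against its contact-frame variable, while the preintegrated contact factor would chain successive contact-frame variables together; with consistent states the residual vanishes, e.g. a base at height 1 with both joints at zero places the foot exactly at the origin.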