Probabilistic Articulated Real-Time Tracking for Robot Manipulation

Format: Article
Language: English
Online access: Order full text

Abstract: We propose a probabilistic filtering method which fuses joint measurements with depth images to yield a precise, real-time estimate of the end-effector pose in the camera frame. This avoids the need for frame transformations when using it in combination with visual object tracking methods. Precision is achieved by modeling and correcting biases in the joint measurements as well as inaccuracies in the robot model, such as poor extrinsic camera calibration. We make our method computationally efficient through a principled combination of Kalman filtering of the joint measurements and asynchronous depth-image updates based on the Coordinate Particle Filter. We quantitatively evaluate our approach on a dataset recorded from a real robotic platform, annotated with ground truth from a motion capture system. We show that our approach is robust and accurate even under challenging conditions such as fast motion, significant and long-term occlusions, and time-varying biases. We release the dataset along with open-source code of our approach to allow for quantitative comparison with alternative approaches.

DOI: 10.48550/arxiv.1610.04871
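
The abstract describes fusing fast, bias-prone joint-encoder measurements (via Kalman filtering) with slower, asynchronous visual corrections from depth images (via the Coordinate Particle Filter). The following is a minimal sketch of that fusion idea for a single joint, not the authors' released implementation: the filter state augments the joint angle with an encoder bias, the encoder update observes their sum, and the depth-image update is replaced here by a direct noisy observation of the angle purely to keep the example self-contained. All noise magnitudes, rates, and class or function names are illustrative assumptions.

```python
# Illustrative sketch only: a bias-aware Kalman filter for one joint, with an
# assumed stand-in for the asynchronous visual update described in the abstract.
import numpy as np


class BiasAwareJointKF:
    """Kalman filter with state x = [joint_angle, encoder_bias] (illustrative)."""

    def __init__(self, q_angle=1e-5, q_bias=1e-7, r_encoder=1e-4, r_visual=1e-2):
        self.x = np.zeros(2)                 # [angle, bias] estimate
        self.P = np.eye(2) * 0.1             # state covariance
        self.Q = np.diag([q_angle, q_bias])  # process noise (assumed values)
        self.r_encoder = r_encoder           # encoder noise variance (assumed)
        self.r_visual = r_visual             # stand-in visual noise variance (assumed)

    def predict(self, d_angle=0.0):
        # The angle follows the commanded increment; the bias is a random walk.
        self.x[0] += d_angle
        self.P += self.Q

    def _update(self, z, H, r):
        # Standard Kalman update for a scalar measurement z = H x + noise.
        y = z - (H @ self.x).item()          # innovation
        S = (H @ self.P @ H.T).item() + r    # innovation variance
        K = (self.P @ H.T / S).ravel()       # Kalman gain
        self.x = self.x + K * y
        self.P = (np.eye(2) - np.outer(K, H)) @ self.P

    def update_encoder(self, measured_angle):
        # Joint encoders observe the true angle plus the bias.
        self._update(measured_angle, np.array([[1.0, 1.0]]), self.r_encoder)

    def update_visual(self, observed_angle):
        # Stand-in for the asynchronous depth-image update: observes the angle directly.
        self._update(observed_angle, np.array([[1.0, 0.0]]), self.r_visual)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    kf = BiasAwareJointKF()
    true_angle, bias = 0.0, 0.05             # simulated joint with a constant encoder bias
    for t in range(300):
        true_angle += 0.01                   # joint moves at a constant rate
        kf.predict(d_angle=0.01)
        kf.update_encoder(true_angle + bias + rng.normal(scale=0.01))
        if t % 30 == 0:                      # slower, asynchronous "visual" correction
            kf.update_visual(true_angle + rng.normal(scale=0.1))
    print(f"angle error: {kf.x[0] - true_angle:+.4f}  estimated bias: {kf.x[1]:.4f}")
```

In this sketch the occasional direct angle observation is what makes the bias identifiable, since the encoder alone only constrains the sum of angle and bias; the paper's actual visual update operates on depth images and the full articulated pose rather than on a single joint angle.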