Grasping Trajectory Optimization with Point Clouds
Format: Article
Language: English
Abstract: We introduce a new trajectory optimization method for robotic grasping
based on a point-cloud representation of robots and task spaces. In our method,
robots are represented by 3D points on their link surfaces. The task space of a
robot is represented by a point cloud that can be obtained from depth sensors.
Using the point-cloud representation, goal reaching in grasping can be
formulated as point matching, while collision avoidance can be efficiently
achieved by querying the signed distance values of the robot points in the
signed distance field of the scene points. Consequently, a constrained
nonlinear optimization problem is formulated to solve the joint motion and
grasp planning problem. The advantage of our method is that the point-cloud
representation is general enough to be used with any robot in any environment.
We demonstrate the effectiveness of our method through grasping experiments on
a tabletop scene and a shelf scene with a Fetch mobile manipulator and a Franka
Panda arm. The project page is available at
\url{https://irvlutd.github.io/GraspTrajOpt}
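The collision-avoidance idea described above can be illustrated with a minimal sketch. This is not the authors' implementation: as a simplifying assumption, the signed distance field of the scene is approximated here by the (unsigned) nearest-neighbor distance from each robot surface point to the scene point cloud, and points closer than a safety margin incur a quadratic penalty.

```python
import numpy as np
from scipy.spatial import cKDTree

def collision_cost(robot_points, scene_points, margin=0.02):
    """Penalize robot surface points that come within `margin` of the
    scene point cloud. Nearest-neighbor distance stands in for a true
    signed distance field (a simplifying assumption of this sketch)."""
    tree = cKDTree(scene_points)
    d, _ = tree.query(robot_points)  # distance to nearest scene point
    return float(np.sum(np.maximum(margin - d, 0.0) ** 2))

# Toy scene: a flat "tabletop" of points on the z = 0 plane,
# sampled on a 51 x 51 grid with 2 cm spacing.
xy = np.stack(np.meshgrid(np.linspace(0, 1, 51),
                          np.linspace(0, 1, 51)), -1).reshape(-1, 2)
scene = np.hstack([xy, np.zeros((xy.shape[0], 1))])

# Two robot surface points: one hovering safely above the table,
# one inside the safety margin and therefore penalized.
robot = np.array([[0.5, 0.5, 0.30],
                  [0.5, 0.5, 0.005]])
print(collision_cost(robot, scene))
```

In the full method, a term of this kind enters the constrained nonlinear optimization together with the point-matching goal cost, and its gradient with respect to the joint configuration pushes the robot points out of the margin along the trajectory.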
DOI: 10.48550/arxiv.2403.05466