Online Evaluation for Learning Feasible Robotic Grasps With Physical Constraints
| Published in: | IEEE/ASME Transactions on Mechatronics, 2024-09, pp. 1–12 |
|---|---|
| Main authors: | , , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
| Abstract: | Existing grasp planning networks often learn from labeled images with grasp examples to eliminate the need for training through physical grasp attempts. As a result, trained networks lack an understanding of the physical constraints involved in successful grasps, leading to infeasible predictions and inaccurate evaluation. In this article, we propose a framework for integrating physical constraints, e.g., collision avoidance, into grasp learning through online grasp evaluation. During training, the proposed framework initially evaluates the feasibility of network predictions using physical constraints. Subsequently, physical supervision is generated based on both the feasible predictions and the geometries of the objects. In this manner, the network learns from its real-time errors and the object shape, in addition to labeled data. Experimental results demonstrated that our evaluation method achieved a significantly lower false rate (5.5%) than the commonly used metrics (intersection over union: 19.0%, SGT: 17.5%). Furthermore, the proposed framework effectively improves the network's real-world grasping success rate on EGAD objects by 18.7% for isolated objects (2450 attempts) and 15.8% for cluttered scenes (331 attempts). These results highlight the effectiveness of integrating physical constraints for feasible grasp prediction and accurate evaluation. |
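The abstract benchmarks its evaluation method against the intersection-over-union (IoU) metric common in grasp detection. As a rough illustration only, rectangle IoU can be computed as in the following minimal sketch; it assumes axis-aligned rectangles given as corner coordinates, whereas grasp rectangles in the literature are typically oriented, and `rect_iou` is a hypothetical helper name, not from the paper.

```python
def rect_iou(a, b):
    """IoU of two axis-aligned rectangles, each given as (x1, y1, x2, y2)."""
    # Overlap extents; clamped at zero when the rectangles are disjoint.
    iw = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    ih = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = iw * ih
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A common convention in grasp-detection benchmarks counts a prediction as correct when its IoU with a labeled grasp exceeds a fixed threshold (often 0.25); as the abstract points out, such purely geometric overlap ignores physical constraints like collisions, which is the gap the proposed online evaluation addresses.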
ISSN: | 1083-4435; 1941-014X |
DOI: | 10.1109/TMECH.2024.3451228 |