Mobile Robot Manipulation using Pure Object Detection
Saved in:

| Field | Value |
|---|---|
| Author | |
| Format | Article |
| Language | eng |
| Subjects | |
| Online access | Order full text |
Abstract: This paper addresses the problem of mobile robot manipulation using object detection. Our approach uses detection and control as complementary functions that learn from real-world interactions. We develop an end-to-end manipulation method based solely on detection and introduce Task-focused Few-shot Object Detection (TFOD) to learn new objects and settings. Our robot collects its own training data and automatically determines when to retrain detection to improve performance across various subtasks (e.g., grasping). Notably, detection training is low-cost, and our robot learns to manipulate new objects using as few as four clicks of annotation. In physical experiments, our robot learns visual control from a single click of annotation and a novel update formulation, manipulates new objects in clutter and other mobile settings, and achieves state-of-the-art results on an existing visual servo control and depth estimation benchmark. Finally, we develop a TFOD Benchmark to support future object detection research for robotics: https://github.com/griffbr/tfod.

DOI: 10.48550/arxiv.2201.12437
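
The abstract describes a loop in which an object detector both drives manipulation (e.g., servoing toward a detected object before grasping) and is retrained from the robot's own click-annotated data when subtask performance degrades. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: the placeholder detector, proportional servo gain, success check, and retraining threshold are all assumptions introduced here; the actual TFOD code and benchmark are in the linked repository.

```python
"""Illustrative sketch only: a detection-driven grasp loop with a simple
"retrain when recent success drops" rule. All names and thresholds are
hypothetical stand-ins, not the paper's TFOD implementation."""

from dataclasses import dataclass
from typing import List, Optional
import random


@dataclass
class Detection:
    x: float      # box center, normalized image coordinates in [0, 1]
    y: float
    score: float  # detector confidence


def detect_object(image) -> Optional[Detection]:
    """Placeholder detector; a real system would run a trained few-shot
    object detector on the current camera frame."""
    if random.random() < 0.9:
        return Detection(x=random.random(), y=random.random(), score=0.8)
    return None  # object not found


def servo_step(det: Detection, gain: float = 0.5) -> tuple:
    """Proportional visual-servo command that drives the detected box
    center toward the image center (0.5, 0.5)."""
    return (gain * (0.5 - det.x), gain * (0.5 - det.y))


def grasp_succeeded() -> bool:
    """Placeholder for the robot's own success check (e.g., gripper feedback)."""
    return random.random() < 0.7


def should_retrain(successes: List[bool], window: int = 10, min_rate: float = 0.6) -> bool:
    """One simple heuristic: retrain the detector when grasp success over the
    last `window` attempts falls below `min_rate`."""
    recent = successes[-window:]
    return len(recent) == window and sum(recent) / window < min_rate


def manipulation_loop(num_attempts: int = 30) -> None:
    successes: List[bool] = []
    for attempt in range(num_attempts):
        image = None  # stand-in for a camera frame
        det = detect_object(image)
        if det is None:
            successes.append(False)
            continue
        dx, dy = servo_step(det)  # move toward the detected object
        print(f"attempt {attempt}: servo command ({dx:+.2f}, {dy:+.2f})")
        successes.append(grasp_succeeded())
        if should_retrain(successes):
            print("success rate low -> collect click annotations and retrain detector")
            successes.clear()  # start a fresh evaluation window


if __name__ == "__main__":
    manipulation_loop()
```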