Touch Semantics for Intuitive Physical Manipulation of Humanoids
Published in: IEEE Transactions on Human-Machine Systems, 2022-12, Vol. 52 (6), pp. 1111-1121
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: order full text
Abstract: Rather than systematically programming joint or task trajectories, having a human physically manipulate the robot for direct adjustments is more intuitive, saves time, and increases usability, especially for nonexperts. Interactive motion generation or repositioning of humanoid robots through direct human-touch manipulation is not an easy task, especially for high-level multijoint maneuvers. We propose a set of design rules for generating intuitive touch semantics called the "two-touch kinematic chain paradigm." Our method interprets user touch intentions to allow motions ranging from low-level single-joint control to high-level whole-body task control with posture generation, stepping, and walking. The goal is to provide the user with an intuitive protocol for physical humanoid manipulation that can serve any application. The generated set of touch semantics is embodied in a finite-state-machine-based framework with a task-space quadratic programming controller; human touch is interpreted through capacitive sensors embedded in the humanoid shell and force-torque sensors located at the ankles and wrists. A position-controlled humanoid robot is used to assess the utility and function of our proposed touch semantics for physical manipulation. Furthermore, a user study with nonexperts examines how our approach is perceived in practice.
ISSN: 2168-2291, 2168-2305
DOI: 10.1109/THMS.2022.3207699
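
The abstract describes a finite-state-machine framework that maps touch events to manipulation modes ranging from single-joint control to whole-body tasks. The sketch below is a minimal, hypothetical Python illustration of how such a two-touch state machine might be organized; the state names, link names, and release behavior are assumptions made for illustration and are not taken from the paper, which couples these states to a task-space quadratic programming controller rather than this simplified dispatch.

```python
# Hypothetical sketch: a minimal finite state machine that maps touch events to
# manipulation modes, loosely following the "two-touch kinematic chain paradigm"
# summarized in the abstract. Link names, states, and transitions are illustrative
# assumptions, not the paper's implementation.

from dataclasses import dataclass, field
from enum import Enum, auto


class Mode(Enum):
    IDLE = auto()          # no active touch
    SINGLE_JOINT = auto()  # one touched link: low-level single-joint control
    CHAIN = auto()         # two touched links: move the kinematic chain between them
    WHOLE_BODY = auto()    # torso or foot touch: posture generation, stepping, walking


@dataclass
class TouchFSM:
    mode: Mode = Mode.IDLE
    touched_links: list = field(default_factory=list)

    def on_touch(self, link: str) -> Mode:
        """Update the machine when a capacitive patch on `link` is pressed."""
        self.touched_links.append(link)
        if link in ("torso", "left_foot", "right_foot"):
            self.mode = Mode.WHOLE_BODY        # high-level whole-body task control
        elif len(self.touched_links) == 1:
            self.mode = Mode.SINGLE_JOINT      # first touch selects a single joint
        else:
            self.mode = Mode.CHAIN             # second touch closes the kinematic chain
        return self.mode

    def on_release(self) -> Mode:
        """All touches released: return to idle and hold the current posture."""
        self.touched_links.clear()
        self.mode = Mode.IDLE
        return self.mode


# Example: touching the elbow and then the wrist selects the forearm chain, which a
# task-space controller (a QP controller in the paper) would then move compliantly.
fsm = TouchFSM()
assert fsm.on_touch("left_elbow") is Mode.SINGLE_JOINT
assert fsm.on_touch("left_wrist") is Mode.CHAIN
assert fsm.on_release() is Mode.IDLE
```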