Expressive Whole-Body Control for Humanoid Robots
Saved in:
| Main authors: | |
| --- | --- |
| Format: | Article |
| Language: | eng |
| Subjects: | |
| Online access: | Order full text |
Abstract: Can we enable humanoid robots to generate rich, diverse, and expressive motions in the real world? We propose to learn a whole-body control policy on a human-sized robot to mimic human motions as realistically as possible. To train such a policy, we leverage large-scale human motion capture data from the graphics community in a Reinforcement Learning framework. However, directly performing imitation learning with the motion capture dataset would not work on the real humanoid robot, given the large gap in degrees of freedom and physical capabilities. Our method, Expressive Whole-Body Control (Exbody), tackles this problem by encouraging the upper humanoid body to imitate a reference motion, while relaxing the imitation constraint on its two legs and only requiring them to follow a given velocity robustly. With training in simulation and Sim2Real transfer, our policy can control a humanoid robot to walk in different styles, shake hands with humans, and even dance with a human in the real world. We conduct extensive studies and comparisons on diverse motions in both simulation and the real world to show the effectiveness of our approach.
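The abstract's central design choice, rewarding upper-body imitation of the mocap reference while asking the legs only to track a commanded velocity, can be illustrated with a reward sketch. The following is a minimal illustration, not the paper's actual implementation: all function and parameter names are hypothetical, and the exponential-kernel shaping and weights are assumptions common in legged-robot RL.

```python
import numpy as np

def decoupled_reward(
    upper_joint_pos,      # current upper-body joint positions (rad)
    ref_upper_joint_pos,  # reference upper-body joint positions from mocap (rad)
    base_lin_vel,         # current base linear velocity in body frame (m/s)
    cmd_lin_vel,          # commanded base linear velocity (m/s)
    w_imitate=0.5,        # weight on upper-body imitation term (assumed)
    w_track=0.5,          # weight on velocity-tracking term (assumed)
):
    """Hypothetical sketch of the decoupling described in the abstract:
    the upper body imitates the reference motion, while the legs are
    only required to follow a given velocity."""
    # Upper-body imitation: penalize deviation from the mocap reference,
    # shaped with an exponential kernel (assumption).
    imitation = np.exp(-np.sum((upper_joint_pos - ref_upper_joint_pos) ** 2))
    # Leg/base objective: track the commanded velocity, with no constraint
    # on the particular leg motion used to achieve it.
    tracking = np.exp(-np.sum((base_lin_vel - cmd_lin_vel) ** 2))
    return w_imitate * imitation + w_track * tracking
```

Decoupling the two terms lets the policy trade expressiveness above the waist against robust locomotion below it, rather than forcing the legs to reproduce reference motions the robot's degrees of freedom and physical capabilities cannot match.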
DOI: 10.48550/arxiv.2402.16796