Hindsight States: Blending Sim and Real Task Elements for Efficient Reinforcement Learning
Format: Article
Language: English
Abstract:

Reinforcement learning has shown great potential in solving complex tasks when large amounts of data can be generated with little effort. In robotics, one approach to generating training data builds on simulations based on dynamics models derived from first principles. However, for tasks that, for instance, involve complex soft robots, devising such models is substantially more challenging. Being able to train effectively in increasingly complicated scenarios with reinforcement learning makes it possible to take advantage of complex systems such as soft robots. Here, we leverage the imbalance in complexity of the dynamics to learn more sample-efficiently. We (i) abstract the task into distinct components, (ii) off-load the simple dynamics parts into the simulation, and (iii) multiply these virtual parts to generate more data in hindsight. Our new method, Hindsight States (HiS), uses this data and selects the most useful transitions for training. It can be used with an arbitrary off-policy algorithm. We validate our method on several challenging simulated tasks and demonstrate that it improves learning both alone and when combined with an existing hindsight algorithm, Hindsight Experience Replay (HER). Finally, we evaluate HiS on a physical system and show that it boosts performance on a complex table tennis task with a muscular robot. Videos and code of the experiments can be found at webdav.tuebingen.mpg.de/his/.
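The three-step recipe in the abstract (split the task into components, off-load the simple dynamics to a simulator, multiply the virtual parts to generate extra transitions in hindsight) can be sketched roughly as follows. This is a minimal illustration under assumed interfaces, not the authors' implementation: `simulate_virtual_part`, the toy reward, and all parameters are hypothetical stand-ins.

```python
import random


def simulate_virtual_part(state, action, rng):
    # Hypothetical stand-in for the cheap, well-modeled dynamics
    # (e.g., ball flight) that HiS-style methods off-load to simulation.
    return state + 0.1 * action + rng.uniform(-0.01, 0.01)


def hindsight_state_augment(transition, n_virtual, rng):
    """Generate extra transitions by re-simulating the simple (virtual)
    task component from freshly sampled states, while keeping the
    hard-to-model real part of the transition fixed."""
    real_part, virtual_state, action, reward = transition
    augmented = [transition]  # always keep the original transition
    for _ in range(n_virtual):
        # Sample an alternative virtual state and roll the cheap
        # simulator forward one step.
        alt = rng.uniform(-1.0, 1.0)
        next_alt = simulate_virtual_part(alt, action, rng)
        # Recompute a toy reward for the hybrid (real + virtual) state:
        # here, closeness of the simulated part to a target at zero.
        new_reward = -abs(next_alt)
        augmented.append((real_part, alt, action, new_reward))
    return augmented


rng = random.Random(0)
base = (0.5, 0.2, 1.0, -0.3)  # (real part, virtual state, action, reward)
batch = hindsight_state_augment(base, n_virtual=4, rng=rng)
print(len(batch))  # 1 real + 4 virtual transitions
```

The augmented batch would then feed any off-policy learner's replay buffer; a transition-selection step (as the abstract mentions) could filter this list before storage.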
DOI: 10.48550/arxiv.2303.02234