Learning to Generate Diverse Dance Motions with Transformer
Main author(s): | , , , , , , |
---|---|
Format: | Article |
Language: | English |
Online access: | Order full text |
Abstract: | With the ongoing pandemic, virtual concerts and live events using
digitized performances of musicians are gaining traction in massively
multiplayer online worlds. However, well-choreographed dance movements are
extremely complex to animate and involve an expensive and tedious production
process. In addition to the use of complex motion capture systems, it
typically requires a collaborative effort between animators, dancers, and
choreographers. We introduce a complete system for dance motion synthesis,
which can generate complex and highly diverse dance sequences given an input
music sequence. As motion capture data covers only a limited range of dance
motions and styles, we introduce a massive dance motion data set created from
YouTube videos. We also present a novel two-stream motion transformer
generative model, which can generate motion sequences with high flexibility.
We further introduce new evaluation metrics for the quality of synthesized
dance motions and demonstrate that our system outperforms state-of-the-art
methods. Our system provides high-quality animations suitable for large
crowds in virtual concerts and can also serve as a reference for professional
animation pipelines. Most importantly, we show that vast online videos can be
effective in training dance motion models. |
---|---|
DOI: | 10.48550/arxiv.2008.08171 |
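
This record contains no implementation details, but as a rough illustration
of the kind of two-stream transformer the abstract describes (one stream
attending over music features, one over past motion, fused to predict future
poses), here is a minimal PyTorch sketch. Every class name, dimension, and
the late-fusion step are assumptions made for illustration, not the authors'
actual architecture.

```python
import torch
import torch.nn as nn


class TwoStreamMotionTransformer(nn.Module):
    """Hypothetical two-stream model: one transformer encodes music
    features, another encodes past poses; the streams are fused and
    decoded into a pose per frame. All dimensions are placeholders."""

    def __init__(self, audio_dim=128, pose_dim=72, d_model=256,
                 nhead=8, num_layers=4):
        super().__init__()
        self.audio_proj = nn.Linear(audio_dim, d_model)
        self.pose_proj = nn.Linear(pose_dim, d_model)
        self.audio_stream = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers,
        )
        self.motion_stream = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers,
        )
        self.head = nn.Linear(2 * d_model, pose_dim)

    def forward(self, audio, poses):
        # audio: (batch, frames, audio_dim) music features per frame
        # poses: (batch, frames, pose_dim) past joint parameters
        a = self.audio_stream(self.audio_proj(audio))
        m = self.motion_stream(self.pose_proj(poses))
        fused = torch.cat([a, m], dim=-1)  # simple late fusion
        return self.head(fused)            # predicted pose per frame


# Usage: 60 frames (about 2 s at 30 fps) of random stand-in features.
model = TwoStreamMotionTransformer()
audio = torch.randn(1, 60, 128)
poses = torch.randn(1, 60, 72)
out = model(audio, poses)  # -> torch.Size([1, 60, 72])
```

A real system would add positional encodings, autoregressive or sampled
decoding for motion diversity, and a training objective; this sketch only
shows the two-stream shape the abstract names.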