BeatDance: A Beat-Based Model-Agnostic Contrastive Learning Framework for Music-Dance Retrieval
Format: Article
Language: English
Online access: Order full text
Abstract: Dance and music are closely related forms of expression, and mutual retrieval between dance videos and music is a fundamental task in fields such as education, art, and sports. However, existing methods often suffer from unnatural generation effects or fail to fully explore the correlation between music and dance. To overcome these challenges, we propose BeatDance, a novel beat-based, model-agnostic contrastive learning framework. BeatDance incorporates a Beat-Aware Music-Dance InfoExtractor, a Trans-Temporal Beat Blender, and a Beat-Enhanced Hubness Reducer to improve dance-music retrieval performance by exploiting the alignment between music beats and dance movements. We also introduce the Music-Dance (MD) dataset, a large-scale collection of over 10,000 music-dance video pairs for training and testing. Experimental results on the MD dataset demonstrate the superiority of our method over existing baselines, achieving state-of-the-art performance. The code and dataset will be made publicly available upon acceptance.
DOI: 10.48550/arxiv.2310.10300
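Since this record only carries the abstract, the following is a minimal, hypothetical sketch of the kind of symmetric contrastive (InfoNCE-style) objective commonly used for cross-modal music-dance retrieval. It is not the authors' released BeatDance code; the function name, feature dimensions, and temperature value are assumptions made for illustration only.

```python
# Hypothetical sketch of a symmetric InfoNCE-style contrastive objective for
# music-dance retrieval. NOT the BeatDance implementation; names and shapes
# are illustrative assumptions.
import torch
import torch.nn.functional as F


def music_dance_contrastive_loss(music_emb, dance_emb, temperature=0.07):
    """Symmetric cross-entropy over cosine-similarity logits.

    music_emb, dance_emb: (batch, dim) embeddings of paired clips, where
    row i of each tensor comes from the same music-dance pair.
    """
    # L2-normalise so the dot product equals cosine similarity.
    music_emb = F.normalize(music_emb, dim=-1)
    dance_emb = F.normalize(dance_emb, dim=-1)

    # (batch, batch) similarity matrix; the diagonal holds the positive pairs.
    logits = music_emb @ dance_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)

    # Average the music-to-dance and dance-to-music retrieval directions.
    loss_m2d = F.cross_entropy(logits, targets)
    loss_d2m = F.cross_entropy(logits.t(), targets)
    return 0.5 * (loss_m2d + loss_d2m)


if __name__ == "__main__":
    music = torch.randn(8, 256)   # e.g. beat-aware music features
    dance = torch.randn(8, 256)   # e.g. motion features of the paired videos
    print(music_dance_contrastive_loss(music, dance))
```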