Towards Interactive Music Generation: A Position Paper

Bibliographic Details
Published in: IEEE Access, 2022, p. 1-1
Main Authors: Dadman, Shayan; Bremdal, Bernt Arild; Bang, Borre; Dalmo, Rune
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: Music generation using deep learning has received considerable attention in recent years. Researchers have developed various generative models capable of imitating musical conventions, learning from musical corpora, and generating new samples based on what they have learned. Although the samples generated by these models are convincing, they often lack musical structure and creativity. Moreover, a vanilla end-to-end approach, which handles all levels of music representation at once, does not offer human-level control and interaction during the learning process, leading to constrained results. Music creation is, after all, an iterative process in which a musician follows certain principles, reusing or adapting various musical features along the way. At the same time, a musical piece adheres to a musical style that breaks down into the distinct concepts of timbre style, performance style, and composition style, together with the coherence among these aspects. Here, we study and analyze the current advances in music generation using deep learning models against several criteria. We discuss the shortcomings and limitations of these models regarding interactivity and adaptability. Finally, we outline potential future research directions, centered on multi-agent systems and reinforcement learning algorithms, for alleviating these shortcomings and limitations.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3225689