PMG-Net: Persian music genre classification using deep neural networks

Bibliographic details
Published in: Entertainment Computing, 2023-01, Vol. 44, p. 100518, Article 100518
Authors: Farajzadeh, Nacer; Sadeghzadeh, Nima; Hashemzadeh, Mahdi
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: •An annotated dataset consisting of 500 pieces of Persian music spanning five genres (Rap, Traditional, Pop, Rock, and Monody) is introduced and made publicly available. •An efficient deep learning-based method for classifying Persian music genres is introduced. Music genres can reveal our preferences and are one of the main tools retailers, libraries, and individuals use to organize music. The music industry also relies on genres to define and target different markets, so the ability to categorize genres is an asset for marketing and music production. Considerable research has been conducted on classifying Western music genres, yet none has so far addressed Persian music genres. In this research, a tailored deep neural network-based method, termed PMG-Net, is introduced to automatically classify Persian music genres. To evaluate PMG-Net, a dataset named PMG-Data, consisting of 500 music pieces from the genres Pop, Rap, Traditional, Rock, and Monody, is collected and labeled, and is made publicly available for researchers. PMG-Net achieves an accuracy of 86% on PMG-Data, indicating acceptable performance compared with existing deep neural network-based approaches.
ISSN: 1875-9521
EISSN: 1875-953X
DOI: 10.1016/j.entcom.2022.100518