Data-Driven Techniques for Music Genre Recognition

Sergio Santiago Renteria Aguilar, Leopoldo Llano, Javier Cantú-Ortiz

Research output: Chapter in Book / Conference paper › Conference paper › peer-review


After the digital revolution, it is unsurprising that data science has taken an interest in music. The sheer amount of available content opens a plethora of possibilities for studying music and its social impact from a data-analytic perspective. This paper studies the relationship between song features and their corresponding genre, to provide data-mining tools for music recommendation and sub-genre identification. For the first task, we compared different classification models, including Random Forests, fully connected neural networks, and Logistic Regression. For the second, we carried out cluster analysis and dimensionality reduction for data visualisation. Overall, Random Forest models performed better in genre classification than fully connected networks, but they suffered from overfitting. Moreover, the highest accuracy obtained (64%) was too low to be useful for genre-recognition applications. Nevertheless, we think our results show the limitations of hand-crafted features and point towards more sophisticated deep learning techniques.
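The model comparison described in the abstract can be sketched along the following lines. This is a minimal illustration, not the authors' code: the synthetic features stand in for whatever song features the paper actually used, and the specific hyperparameters are assumptions.

```python
# Hedged sketch: compare the three classifier families named in the
# abstract (Random Forest, fully connected network, Logistic Regression)
# on synthetic stand-in data for song features and genre labels.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in: 500 "songs", 10 numeric features, 4 "genres".
X, y = make_classification(n_samples=500, n_features=10, n_informative=6,
                           n_classes=4, random_state=0)

models = {
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "fully_connected": MLPClassifier(hidden_layer_sizes=(32,),
                                     max_iter=1000, random_state=0),
}

# 5-fold cross-validated accuracy for each model family.
scores = {name: cross_val_score(m, X, y, cv=5).mean()
          for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```

Cross-validation is used here rather than a single train/test split because the abstract reports overfitting in the Random Forest models, which a single split can mask.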
Original language: English
Title of host publication: Computer Science & Information Technology
Subtitle of host publication: 9th International Conference on Advanced Information Technologies and Applications (ICAITA 2020)
Editors: David C. Wyld, Dhinaharan Nagamalai
Publisher: Academy and Industry Research Collaboration Center (AIRCC)
Publication status: Published - 11 Jul 2020
Externally published: Yes
Event: 9th International Conference on Advanced Information Technologies and Applications - Toronto, Canada
Duration: 11 Jul 2020 - 12 Jul 2020


Conference: 9th International Conference on Advanced Information Technologies and Applications
Abbreviated title: ICAITA 2020

