Indexing Music by Mood: Design and Integration of an Automatic Content-based Annotator

Title: Indexing Music by Mood: Design and Integration of an Automatic Content-based Annotator
Publication Type: Journal Article
Year of Publication: 2010
Authors: Laurier, C., Meyers, O., Serrà, J., Blech, M., Herrera, P., & Serra, X.
Journal Title: Multimedia Tools and Applications
Journal Date: 05/2010
Keywords: mir, mood, pharos
Abstract: In the context of content analysis for indexing and retrieval, a method for automatic music mood annotation is presented. The method is based on results from psychological studies and framed as a supervised learning approach using musical features automatically extracted from the raw audio signal. We present some of the audio features most relevant to this problem. A ground truth, used for training, is created using both social network information systems (wisdom of crowds) and individual experts (wisdom of the few). At the experimental level, we evaluate our approach on a database of 1000 songs. Tests of different classification methods, configurations and optimizations have been conducted, showing that Support Vector Machines perform best for the task at hand. Moreover, we evaluate the algorithm's robustness against different audio compression schemes. This aspect, often neglected, is fundamental to building a system that is usable in real conditions. In addition, the integration of a fast and scalable version of this technique into the European project PHAROS is discussed. This real-world application demonstrates the usability of this tool for annotating large-scale databases. We also report on a user evaluation in the context of the PHAROS search engine, asking people about the utility, interest and innovation of this technology in real-world use cases.
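To illustrate the supervised-learning setup the abstract describes (an SVM classifier over audio feature vectors), the following is a minimal, self-contained sketch. It is not the paper's implementation: it trains a tiny linear SVM by subgradient descent on the hinge loss, and all feature dimensions, data, and labels are synthetic placeholders standing in for descriptors extracted from real audio.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "audio features": two separable clusters standing in for
# per-song descriptors (e.g. timbre or rhythm statistics). Labels are
# a binary mood ground truth encoded as -1 / +1.
n, d = 200, 8
X = np.vstack([rng.normal(-1.0, 1.0, (n // 2, d)),
               rng.normal(+1.0, 1.0, (n // 2, d))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

# Linear SVM trained by stochastic subgradient descent on the
# L2-regularized hinge loss: max(0, 1 - y * (w.x + b)) + lam * |w|^2.
w, b = np.zeros(d), 0.0
lam, lr = 1e-3, 0.1
for epoch in range(100):
    for i in rng.permutation(n):
        margin = y[i] * (X[i] @ w + b)
        if margin < 1:                       # point violates the margin
            w += lr * (y[i] * X[i] - lam * w)
            b += lr * y[i]
        else:                                # only the regularizer acts
            w -= lr * lam * w

pred = np.sign(X @ w + b)
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

A production annotator like the one described would instead use a tuned kernel SVM on real extracted features, but the training objective and the use of labeled mood ground truth are the same in spirit.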