Music Mood Annotator Design and Integration

Title: Music Mood Annotator Design and Integration
Publication Type: Conference Paper
Year of Publication: 2009
Conference Name: 7th International Workshop on Content-Based Multimedia Indexing
Authors: Laurier, C., Meyers, O., Serrà, J., Blech, M., & Herrera, P.
Conference Start Date: 03/06/2009
Conference Location: Chania, Crete, Greece
Keywords: classification, emotion, mir, mood, music, pharos
Abstract: A robust and efficient technique for automatic music mood annotation is presented. A song's mood is predicted by a supervised machine learning approach based on musical features extracted from the raw audio signal. A ground truth, used for training, is created using both social network information systems and individual experts. Tests of 7 different classification configurations have been performed, showing that Support Vector Machines perform best for the task at hand. Moreover, we evaluate the algorithm's robustness to different audio compression schemes. This aspect, often neglected, is fundamental to building a system that is usable in real conditions. In addition, the integration of a fast and scalable version of this technique with the European Project PHAROS is discussed.
Published document: files/publications/Laurier_MusicMoodAnnotator.pdf
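
Note: The following is a minimal Python sketch, not the authors' code, illustrating the kind of supervised setup the abstract describes: several classification configurations compared by cross-validation on pre-extracted audio features, with a Support Vector Machine among them. The feature matrix, labels, library choices (scikit-learn), and the particular classifiers compared are assumptions for illustration only; the paper's own features, ground truth, and full set of 7 configurations are described in the published document.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in data: 200 songs x 40 audio descriptors and a binary mood label
# (e.g. "happy" vs. "not happy"); replace with real extracted features
# and expert/social-network ground truth as in the paper.
X = rng.normal(size=(200, 40))
y = rng.integers(0, 2, size=200)

# A few hypothetical classification configurations; the paper reports
# that Support Vector Machines performed best for this task.
configurations = {
    "svm_rbf": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    "decision_tree": DecisionTreeClassifier(max_depth=5),
    "logistic_regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
}

for name, model in configurations.items():
    scores = cross_val_score(model, X, y, cv=10)  # 10-fold cross-validation
    print(f"{name}: accuracy {scores.mean():.3f} +/- {scores.std():.3f}")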