Tutorial on Music Autotagging

Title: Tutorial on Music Autotagging
Publication Type: Miscellaneous
Year of Publication: 2013
Authors: Sordo, M., & Coviello, E.
Abstract: Technology is revolutionizing the way in which music is distributed and consumed. As a result, millions of songs are instantly available to millions of people on the Internet. This has created the need for novel music search and discovery technologies to help users find a "mellow Beatles song" on a nostalgic night, "scary Halloween music" on October 31st, or satisfy a sudden desire for "romantic jazz with saxophone and deep male vocals", without knowing an appropriate artist or song title. One important task in the realization of a music search engine is the automatic annotation of music with descriptive keywords, or tags, based on the audio content of the song. Music annotations can be used for a variety of purposes, such as searching for songs exhibiting specific qualities (e.g., jazz songs with female vocals and saxophone) or retrieving semantically similar songs (e.g., for generating playlists). In this tutorial we look at the current state of the art in content-based automatic music tagging. We cover the challenges in building an automatic music tagger, from ground-truth collection and statistical modeling to inference and evaluation. We examine past and recent work, and discuss the advantages and disadvantages of the various solutions. Finally, we place emphasis on the most recent discoveries.
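As a rough illustration of the statistical-modeling-and-inference pipeline the abstract describes, the toy sketch below treats autotagging as ranking tags by how well a per-tag generative model explains a song's frame-level audio features. All names, the synthetic features, and the single diagonal-covariance Gaussian per tag are illustrative assumptions for this sketch; real systems extract features such as MFCCs from audio and use richer models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "songs": each is a bag of frame-level feature vectors
# (standing in for real audio features such as MFCCs).
def make_song(mean, n_frames=50, dim=4):
    return mean + rng.normal(size=(n_frames, dim))

tags = ["jazz", "rock"]

# Synthetic training set: (song, set-of-tags) pairs, i.e., the "ground truth"
# annotations the tutorial discusses collecting.
train = [
    (make_song(np.zeros(4)), {"jazz"}),
    (make_song(np.full(4, 3.0)), {"rock"}),
    (make_song(np.zeros(4)), {"jazz"}),
    (make_song(np.full(4, 3.0)), {"rock"}),
]

# Statistical modeling: fit one diagonal-covariance Gaussian per tag over the
# frames of all songs annotated with that tag.
models = {}
for tag in tags:
    frames = np.vstack([song for song, ts in train if tag in ts])
    models[tag] = (frames.mean(axis=0), frames.var(axis=0) + 1e-6)

def avg_log_likelihood(song, mean, var):
    # Mean per-frame log-likelihood under an independent Gaussian per dimension.
    per_dim = -0.5 * (np.log(2 * np.pi * var) + (song - mean) ** 2 / var)
    return per_dim.sum(axis=1).mean()

def autotag(song):
    # Inference: rank tags by likelihood; a real tagger would return the top k.
    scores = {t: avg_log_likelihood(song, *m) for t, m in models.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

Ranking tags by likelihood (rather than making hard yes/no decisions) is what makes annotations directly usable for retrieval and playlist generation.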
Preprint/postprint document: http://www.dtic.upf.edu/~msordo/ISMIR2013/tutorial/slides.zip