News and Events

SMC students & former MTG member winners of an international musical Turing test

AlgoRhythms is a dance music Turing test for live DJing, co-sponsored with Dartmouth College's Program in Digital Music and Dartmouth's Neukom Institute for Computational Science. AlgoRhythms was one of three tests (the other two involved writing stories and sonnets) proposed to gauge how much intelligence state-of-the-art creative software can currently demonstrate.

The eight participants in AlgoRhythms submitted automatic DJing algorithms capable of creating a 15-minute DJ set using only songs from a previously unseen music collection and an unknown seed song to start with. First a selected jury of experts, and afterwards a jury of listeners/dancers, rated the "human-likeness" of the 15-minute sessions. The system “DadaBots – Jungle Bot” convinced 50% of the experts that it was human, whereas in the “field test” it and “DJ Codo Nudo” reached the highest ratings (just 40% “human-likeness”) and shared the first prize ($2000 each).

“DJ Codo Nudo” was submitted by two UPF Sound and Music Computing Master’s students, Jaume Parera and Pritish Chandna. Jaume is in fact developing his Master’s thesis on automatic DJing under the supervision of professors Sergi Jordà and Perfecto Herrera, in connection with the Giantsteps EU project, which researches and develops tools for music creation. “DadaBots – Jungle Bot”, in turn, was a system developed by CJ Carr and Zack Zukowski from Medford, MA, USA.

The second prize ($1000) was granted to a former MTG member, Gerard Roma, who submitted “Dub Life”. Gerard was for many years one of the main developers of Freesound. He finished his PhD thesis at the MTG last year and is now working as a research software developer at the Centre for Vision, Speech and Signal Processing of the University of Surrey.

More info about the contests:

http://bregman.dartmouth.edu/turingtests/

You can listen to (and apparently vote on, using Chrome or Safari) the different sessions here:
http://bregman.dartmouth.edu/turingtests/DJPoll

You will soon be able to read about the state of the art in Intelligent Music Systems in a special issue of the ACM Transactions on Intelligent Systems and Technology, co-edited by Markus Schedl, Yi-Hsuan Yang and Perfecto Herrera.

20 May 2016 - 10:58
Open PhD position on Information Retrieval Evaluation

The Department of Information and Communication Technologies, Universitat Pompeu Fabra in Barcelona is opening a PhD fellowship in the area of Information Retrieval Evaluation to start in the Fall of 2016.

Topics: evaluation, experimental design, datasets, statistics, user studies, reliability, music similarity, melody extraction.

Requirements: Candidates must have a good Master's degree in Computer Science, Statistics or Mathematics. Candidates must be confident in some of these areas: information retrieval, machine learning and statistics; they must also have excellent programming skills, be fluent in English and possess good communication skills. Musical knowledge is not necessary. Previous experience in research and a track record of publications are preferable.

Application closing date: 25/05/2016

Start date: 01/10/2016

Duration: 3+1 years

Research lab:  Music Information Research lab, Department of Information and Communication Technologies, Universitat Pompeu Fabra

Supervisors: Julián Urbano and Emilia Gómez

More information on grant details:

http://portal.upf.edu/web/etic/doctorat

http://portal.upf.edu/web/etic/predoctoral-research-contracts

Application: Interested candidates should send a motivation letter, a CV (preferably with references), and academic transcripts to Prof. Julián Urbano (julian [dot] urbano [at] upf [dot] edu) before May 20th 2016. Please include [PhD IR] in the subject line.

17 May 2016 - 12:39
Melodic Contour and Mid-Level Global Features Applied to the Analysis of Flamenco Cantes - JNMR

A paper on melodic similarity in flamenco, co-authored by Emilia Gómez, has been published in the Journal of New Music Research and is now available online.

Abstract
This work focuses on the topic of melodic characterization and similarity in a specific musical repertoire: a cappella flamenco singing, more specifically in debla and martinete styles. We propose the combination of manual and automatic description. First, we use a state-of-the-art automatic transcription method to account for general melodic similarity from music recordings. Second, we define a specific set of representative mid-level melodic features, which are manually labelled by flamenco experts. Both approaches are then contrasted and combined into a global similarity measure. This similarity measure is assessed by inspecting the clusters obtained through phylogenetic algorithms and by relating similarity to categorization in terms of style. Finally, we discuss the advantage of combining automatic and expert annotations as well as the need to include repertoire-specific descriptions for meaningful melodic characterization in traditional music collections.

This is the result of joint work within the COFLA project, to which the MTG contributes technologies for the automatic transcription and melody description of music recordings.
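
As a rough illustration of the kind of combination described in the abstract, the sketch below fuses a transcription-based distance matrix with an expert-annotation distance matrix through a weighted sum and clusters the result. The equal weighting, the random toy data and the use of hierarchical clustering (instead of the phylogenetic algorithms used in the paper) are assumptions made purely for illustration; this is not the paper's actual code.

# Hypothetical sketch: fusing an automatic (transcription-based) distance
# matrix with an expert-annotation distance matrix into a global similarity
# measure. Weights and toy data are illustrative only.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
n = 6  # toy collection of six recordings

def random_distance_matrix(n):
    # Symmetric toy distances in [0, 1] with a zero diagonal.
    m = rng.random((n, n))
    m = (m + m.T) / 2.0
    np.fill_diagonal(m, 0.0)
    return m

d_auto = random_distance_matrix(n)    # from automatic transcriptions
d_expert = random_distance_matrix(n)  # from expert mid-level labels

alpha = 0.5  # fusion weight (assumption)
d_global = alpha * d_auto + (1.0 - alpha) * d_expert

# Cluster the fused distances; clusters can then be compared to style labels.
Z = linkage(squareform(d_global), method="average")
clusters = fcluster(Z, t=2, criterion="maxclust")
print(clusters)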

This is a video example of the type of styles we analyze in this paper, done by Nadine Kroher last year:

You can read the full paper online:

http://www.tandfonline.com/doi/full/10.1080/09298215.2016.1174717

28 Apr 2016 - 11:45
2nd DTIC-UPF Prize for the Best Final Degree Project (TFG)

Once again this year, the Department of Information and Communication Technologies of Universitat Pompeu Fabra announces the 2nd Prize for the Best Final Degree Project (TFG) in ICT.

The prize is aimed at undergraduate students who plan to present their TFG during the 2015-2016 academic year at a Spanish university, public or private. The projects must be related to the topics of the Department's research master's programmes in ICT (one of which is the Master in Sound and Music Computing).

Prizes: two prizes will be awarded this year: the Prize for the Best Final Degree Project in Spain, consisting of 750 euros and free tuition for the master's programme selected by the candidate in the application (if admitted), and the María de Maeztu Mention for Reproducibility in Software.

Submission deadline: 10 June 2016

Rules and registration: www.upf.edu/etic/premisTFG

More information: 

28 Apr 2016 - 11:01
PhD fellowship on Music Information Retrieval at MTG

The Music Technology Group (MTG) of the Department of Information and Communication Technologies, Universitat Pompeu Fabra in Barcelona is opening a PhD fellowship in the area of Music Information Retrieval to start in the Fall of 2016.

Application closing date: 05/05/2016

Start date: 01/10/2016

Research lab:  Music Information Research lab, Music Technology Group, Department of Information and Communication Technologies, Universitat Pompeu Fabra
Supervisor: Emilia Gómez

Duration: 3+1 years

Topics: automatic transcription, sound source separation, music classification, singing voice processing, melody extraction, music synchronization, classical music, computational ethnomusicology.

Requirements: Candidates must have a good Master's degree in Computer Science, Electronic Engineering, Physics or Mathematics. Candidates must be confident in some of these areas: signal processing, information retrieval and machine learning; they must also have excellent programming skills, be fluent in English and possess good communication skills. Musical knowledge would be an advantage, as would previous experience in research and a track record of publications.

More information on grant details:
http://portal.upf.edu/web/etic/doctorat
http://portal.upf.edu/web/etic/predoctoral-research-contracts
Provisional starting date: October 1st 2016

Application: Interested candidates should send a motivation letter, a CV (preferably with references), and academic transcripts to Prof. Emilia Gómez (emilia [dot] gomez [at] upf [dot] edu) before May 5th 2016. Please include in the subject [PhD MIR].


15 Apr 2016 - 14:50
PhD Studentship in Technology Enhanced Learning of Music Instruments
Application closing date: 22/04/2016
Start date: 01/09/2016
Research group: Music Technology Group, Department of Information and Communication Technologies, Universitat Pompeu Fabra
Duration: 3 years (funding available)

Applications are invited for two fully funded PhD studentships at the Music Technology Group, Universitat Pompeu Fabra, Barcelona, Spain, undertaking research into Technology Enhanced Learning of Music Instruments.

TELMI is a joint project of 3 academic and 2 industry partners: Universitat Pompeu Fabra, Spain; Royal College of Music, UK; University of Genova, Italy; HIGHSKILLZ, UK; SAICO INTELLIGENCE, S.L., Spain. The aim of the project is to study how we learn musical instruments, taking the violin as a case study, from a pedagogical and scientific perspective, and to create new interactive, assistive, self-learning, augmented-feedback and social-aware systems complementary to traditional teaching. As a result of a tightly coupled interaction between technical and pedagogical partners, the project will attempt to answer questions such as “What will musical instrument learning environments be like in 5-10 years' time?” and “What impact will these new musical environments have on instrument learning as a whole?”. More information about the project can be found at http://mtg.upf.edu/node/3367

The student will be a member of the Music Technology Group, Department of Information and Communication Technologies, Universitat Pompeu Fabra and will be supervised by Dr. Rafael Ramirez and Dr. Alfonso Perez-Carrillo. The successful candidate will pursue research at the intersection of audio and video signal processing, machine learning and cognitive sciences in the context of music performance pedagogy. The work will involve the development of multimodal signal processing algorithms, design of augmented visual feedback systems and the development of non-intrusive low-cost sensing systems for violin learning/teaching.

Candidates must have a good Master's degree in Computer Science, Electronic Engineering, Physics or Mathematics. Candidates must be confident in signal processing, have excellent programming skills, be fluent in English and possess good communication skills. Experience in machine learning and music performance would be an advantage, as would previous experience in research and a track record of publications. Interested candidates should apply by sending a full CV and a letter of interest to Dr. Alfonso Perez and Dr. Rafael Ramirez. Informal enquiries can be made by email to Dr. Alfonso Perez-Carrillo (alfonso [dot] perez [at] upf [dot] edu, http://www.dtic.upf.edu/~aperez/) and Dr. Rafael Ramirez (rafael [dot] ramirez [at] upf [dot] edu, http://www.dtic.upf.edu/~rramirez/).
5 Apr 2016 - 18:26
CANTE: Open Algorithm, Code & Data for the Automatic Transcription of Flamenco Singing

The MTG has published CANTE: an Open Algorithm, Code & Data for the Automatic Transcription of Flamenco Singing.

The proposed system outperforms state-of-the-art singing transcription systems with respect to voicing accuracy, onset detection and overall performance when evaluated on flamenco singing datasets. We hope it will be a contribution not only to flamenco research but also to research on other singing styles.

You can read about our algorithm in the paper we published in IEEE TASP, where we present the method, the evaluation strategy and a comparison with state-of-the-art approaches. And you can not only read about it but actually try it: we have published open-source software for the algorithm, plus a music dataset for its comparative evaluation, cante100 (I will talk about the flamenco corpus in another post). All of this is meant to foster research reproducibility and to motivate people to work on flamenco music.
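
For readers who want to run a comparable evaluation on their own material, here is a minimal sketch that scores an estimated note transcription against a reference using the open-source mir_eval toolkit, covering note-level accuracy and onset detection. The note lists below are made up and the metric settings are mir_eval defaults; this is not the exact evaluation protocol of the paper.

# Hypothetical sketch: scoring an automatic singing transcription against a
# manual reference with mir_eval. The note lists are made up; a real
# evaluation would load annotations from a dataset such as cante100.
import numpy as np
import mir_eval

# Reference and estimated notes: onset/offset intervals (s) and pitches (Hz).
ref_intervals = np.array([[0.50, 1.00], [1.20, 1.80], [2.00, 2.60]])
ref_pitches = np.array([220.0, 246.9, 261.6])
est_intervals = np.array([[0.52, 0.98], [1.25, 1.70], [2.40, 2.90]])
est_pitches = np.array([220.0, 246.9, 293.7])

# Note-level precision/recall/F-measure matching onsets, offsets and pitches.
p, r, f, overlap = mir_eval.transcription.precision_recall_f1_overlap(
    ref_intervals, ref_pitches, est_intervals, est_pitches)

# Onset-only F-measure within a 50 ms tolerance window.
onset_f, onset_p, onset_r = mir_eval.onset.f_measure(
    ref_intervals[:, 0], est_intervals[:, 0], window=0.05)

print(f"note F={f:.2f}  onset F={onset_f:.2f}")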

¡Olé!

5 Apr 2016 - 10:20
Announcing the Sónar Innovation Challenge
The MTG will not organise the Barcelona Music Hack Day this year; instead, we are starting a new initiative in collaboration with the Sónar Festival: the Sónar Innovation Challenge (SIC).
 
The Sónar Innovation Challenge is along the same lines as the MHD, but a bit different. SIC is a platform for collaboration between tech companies and creators (programmers, designers, artists...) that aims to produce innovative prototypes to be showcased at Sonar+D. 
 
At SIC, tech companies propose challenges to the creative community based on concrete needs for innovation. Creative coders, artists and designers who sign up for a challenge will have, after a selection process, the opportunity to work on a prototype within a unique collaborative environment, together with other challengers and company mentors.
 
4 Apr 2016 - 09:41
Key Estimation in Electronic Dance Music, MTG presentation at ECIR 2016

This week the MTG is presenting some work at an oral session at the 38th European Conference on Information Retrieval (ECIR 2016), in Padua (IT), 20-23 March 2016.

Ángel Faraldo is presenting a paper titled "Key Estimation in Electronic Dance Music", written together with Emilia Gómez, Sergi Jordà and Perfecto Herrera, which will be published in the conference proceedings by Springer-Verlag.
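
For context on what key estimation involves computationally, here is a minimal sketch of a textbook template-matching baseline (an averaged chromagram correlated with the Krumhansl key profiles), written with librosa. This is a generic baseline shown for illustration, not the method proposed in the paper.

# Hypothetical sketch: a textbook template-matching key estimator
# (Krumhansl profiles correlated with an averaged chromagram). This is a
# generic baseline, not the method of the ECIR 2016 paper.
import numpy as np
import librosa

MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                  2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                  2.54, 4.75, 3.98, 2.69, 3.34, 3.17])
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def estimate_key(audio_path):
    y, sr = librosa.load(audio_path)
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr).mean(axis=1)
    best, best_corr = None, -np.inf
    for shift in range(12):  # try all 12 possible tonics
        rotated = np.roll(chroma, -shift)
        for profile, mode in ((MAJOR, "major"), (MINOR, "minor")):
            corr = np.corrcoef(rotated, profile)[0, 1]
            if corr > best_corr:
                best, best_corr = f"{NOTES[shift]} {mode}", corr
    return best

# print(estimate_key("track.wav"))  # e.g. "A minor"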

21 Mar 2016 - 13:24
Concert and demos: Phenicx project, classical music for the XXI century

A concert and demos showcasing the technologies developed as part of the Phenicx project will take place on Wednesday 30th March 2016 at 19h at Arts Santa Mònica (Claustre), La Rambla 7, Barcelona.

Classical music meets new technologies through the Phenicx project, which proposes new ways to enjoy the live music experience through innovative systems that let viewers watch and listen to a concert in a personalized way, depending on their interests.

After three years of international research, we invite the Barcelona audience to discover the technologies developed in the framework of the PHENICX project at a special event that will include several activities:

  • Live demonstration of technology and classical music performance: three live music demos at the piano of about 10 minutes each. The audience will be introduced to various ways in which music practice can be enriched and facilitated thanks to novel PHENICX technologies. More specifically, the demos will cover three themes (the exact time schedule will be published close to the event):
    • Discovering new aspects about music: in the 19th century, virtuoso pianists travelled Europe playing transcriptions of important works so that a broader audience could discover them. Using a piano transcription of Beethoven's 'The Creatures of Prometheus', we will have you discover various musical layers and dimensions of this orchestral piece.
    • Music structure: ever wondered how musical themes connect to form a larger structure? We will unravel this and show you live how a piece develops the way it does.
    • Anytime performance tracking: fed up with carrying around heavy books of sheet music and having to turn pages at inconvenient moments? We will show you how state-of-the-art performance tracking technology can enable live score following, wherever you are in a piece, literally putting a full music library at your fingertips (a simplified alignment sketch follows this list).
  • A space for demos: discover the various components of our integrated prototype, offering an enriched experience of a concert. For example, listen to the different instruments of the orchestra, watch a live scrolling score, and browse symphonic work by considering its structural components.
  • Becoming the maestro: an interactive installation to simulate the role of a conductor and to understand the control of the different instruments and their interpretations through gestures.
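
As a rough idea of what such performance tracking involves, here is a minimal sketch that aligns a recorded performance to a reference rendition offline, using dynamic time warping over chroma features with librosa. The file names are placeholders and the whole approach is a simplified stand-in: real score following such as the PHENICX technology works online and is considerably more sophisticated.

# Hypothetical sketch: offline audio-to-reference alignment with DTW as a
# simplified stand-in for real-time performance tracking. File names are
# placeholders.
import numpy as np
import librosa

def align(performance_path, reference_path, hop_length=1024):
    y_perf, sr = librosa.load(performance_path)
    y_ref, _ = librosa.load(reference_path, sr=sr)

    # Chroma features are a common representation for audio alignment.
    c_perf = librosa.feature.chroma_cqt(y=y_perf, sr=sr, hop_length=hop_length)
    c_ref = librosa.feature.chroma_cqt(y=y_ref, sr=sr, hop_length=hop_length)

    # Dynamic time warping between the two chroma sequences.
    _, wp = librosa.sequence.dtw(X=c_ref, Y=c_perf, metric="cosine")
    wp = np.flip(wp, axis=0)  # the path is returned end-to-start

    # Convert each warping-path point to seconds.
    times = librosa.frames_to_time(wp, sr=sr, hop_length=hop_length)
    return times  # column 0: reference time, column 1: performance time

# times = align("performance.wav", "reference_rendition.wav")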

In collaboration with

  • Multimedia Computing Group (MMC) - Technische Universiteit Delft
  • Department of Computational Perception (CP) - Johannes Kepler Universität Linz
  • Royal Concertgebouw Orchestra
  • Video Dock BV
  • Austrian Research Institute for Artificial Intelligence
  • Escola Superior de Música de Catalunya (ESMUC)

With the support of

European Commission, FP7 (Seventh Framework Programme)

14 Mar 2016 - 17:21