News and Events

Korg releases a new Tuner with the collaboration of the MTG
Korg has announced the TM-50TR, a Tuner / Metronome / Tone Trainer device that detects not only the pitch, but also the volume and tone of the sound as a performer plays. The Tone Trainer function is based on Korg's new ARTISTRY technology, a proprietary technology for analyzing and evaluating sound that was developed through cooperative research under the supervision of Xavier Serra, Director of the Music Technology Group at Pompeu Fabra University in Barcelona, Spain.
In addition to its high precision as a tuner, the TM-50TR features a new "Tone Trainer" function that evaluates the player's sound in even greater detail. When the performer plays a sustained note, the TM-50TR detects not only the pitch, but also the dynamics (volume) and brightness (tonal character). These three elements are displayed on the TM-50TR's meter in real time. When the performer finishes playing the note, the stability of each of these three elements is shown in a graph, allowing the player to see at a glance whether the sound is stable.
By analyzing these three basic elements of sound, including tuning, the TM-50TR can identify which aspects of the performer's playing need improvement, helping them practice more efficiently.
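The three measurements described above map onto standard signal-analysis building blocks: pitch can be estimated by autocorrelation, dynamics by RMS energy, and brightness by the spectral centroid. The following is a minimal sketch in Python (NumPy only); the function names and parameters are illustrative assumptions, not Korg's or the MTG's actual algorithms.

```python
import numpy as np

def analyze_note(frames, sr):
    """Per-frame pitch (Hz), dynamics (RMS) and brightness (spectral
    centroid, Hz) for a list of equal-length audio frames."""
    results = []
    for frame in frames:
        # Dynamics: root-mean-square energy of the frame.
        rms = np.sqrt(np.mean(frame ** 2))

        # Pitch: peak of the autocorrelation, skipping very short lags
        # (here, anything corresponding to a pitch above ~1 kHz).
        ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
        min_lag = int(sr / 1000)
        lag = min_lag + np.argmax(ac[min_lag:])
        pitch = sr / lag

        # Brightness: spectral centroid (energy-weighted mean frequency).
        spectrum = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(len(frame), 1.0 / sr)
        centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)

        results.append((pitch, rms, centroid))
    return results

def stability(values):
    """Relative spread of a sequence: lower means a steadier note."""
    values = np.asarray(values)
    return np.std(values) / (np.mean(values) + 1e-12)
```

Running `stability()` over the per-frame pitches, RMS values or centroids of a sustained note yields a single steadiness score per element, analogous to the after-the-note graph the TM-50TR displays.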
25 Jul 2016 - 09:44
Best paper award at NIME 2016

A paper presented by MTG researchers (Cárthach Ó Nuanáin, Sergi Jordà & Perfecto Herrera) has received the best paper award at the 16th International Conference on New Interfaces for Musical Expression (NIME), one of the most relevant and influential conferences in the field of music technology, held recently in Brisbane, Australia.

The paper "An Interactive Software Instrument for Real-time Rhythmic Concatenative Synthesis" describes an approach for generating and visualising new rhythmic patterns from existing audio in real time using concatenative synthesis. A graph-based model enables a novel two-dimensional visualisation and manipulation of new patterns that mimic the rhythmic and timbral character of an existing target seed pattern. A VST audio plugin implementing the reported research has been well received, not only at the Brisbane presentation but also at non-academic events such as Sonar+D and Music Tech Fest.
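The core idea of concatenative synthesis, selecting existing audio units whose features best match a target seed pattern, can be illustrated with a simple nearest-neighbour selection. This is a generic sketch of the technique, not the authors' implementation; the feature vectors and names below are hypothetical.

```python
import numpy as np

def select_units(corpus_features, target_features):
    """For each step of a target seed pattern, pick the corpus unit
    (e.g. an onset-segmented audio slice) whose feature vector is
    closest in Euclidean distance."""
    corpus = np.asarray(corpus_features)   # shape: (n_units, n_features)
    target = np.asarray(target_features)   # shape: (n_steps, n_features)
    # Pairwise distances between every target step and every corpus unit.
    dists = np.linalg.norm(target[:, None, :] - corpus[None, :, :], axis=2)
    return np.argmin(dists, axis=1)        # index of best unit per step

# Hypothetical feature vectors: [loudness, spectral centroid] per unit.
corpus = [[0.9, 200.0],    # kick-like
          [0.5, 3000.0],   # snare-like
          [0.2, 8000.0]]   # hat-like
seed = [[1.0, 150.0], [0.3, 7500.0], [0.6, 2800.0], [0.25, 8200.0]]
print(select_units(corpus, seed))  # → [0 2 1 2]
```

In the paper's approach, a graph built over the corpus additionally supports the 2-D visualisation and controlled variation of patterns; the distance-based matching above is only the basic selection step.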

22 Jul 2016 - 15:29
Keynote at IMS Conference 2016

Xavier Serra gives a keynote at the Conference of the International Musicological Society, which takes place from July 1st to 6th, 2016 in Stavanger, Norway.

Title: The computational study of a musical culture through its digital traces

Abstract: From most musical cultures there are digital traces, digital artefacts, that can be processed and studied computationally and this has been the focus of computational musicology for already several decades. This type of research requires clear formalizations and some simplifications, for example, by considering that a musical culture can be conceptualized as a system of interconnected entities. A musician, an instrument, a performance, or a melodic motive, are examples of entities and they are linked through various types of relationships. We then need adequate digital traces of the entities, for example a textual description can be a useful trace of a musician and a recording a trace of a performance. The analytical study of these entities and of their interactions is accomplished by processing the digital traces and by generating mathematical representations and models of them. But a more ambitious goal is to go beyond the study of individual artefacts and analyze the overall system of interconnected entities in order to model a musical culture as a whole. The reader might think that this is science fiction, and she might be right, but there is research trying to advance in this direction. In this article we overview the challenges involved in this type of research and review some results obtained in various computational studies that we have carried out of several music cultures. In these studies, we have used audio signal processing, machine learning, and semantic web methodologies to describe various characteristics of the chosen musical cultures.
29 Jun 2016 - 23:09
Best papers awards at FMA 2016 and at CBMI 2016

In the same week, two papers from the MTG obtained best paper awards at two conferences. Georgi Dzhambozov, first author, received the best paper award at FMA 2016 for the paper entitled "Automatic Alignment of Long Syllables In A cappella Beijing Opera". Jordi Pons, first author, received the best paper award at CBMI 2016 for the paper entitled "Experimenting with Musically Motivated Convolutional Neural Networks".

24 Jun 2016 - 17:31
Participation in the Data-driven Knowledge Extraction Workshop at UPF

Several members of the MTG present their research projects at the upcoming María de Maeztu DTIC-UPF Data-driven Knowledge Extraction Workshop, which takes place at UPF on June 28th-29th, 2016. The workshop is open to the public and registration is free.

Here are the presentations with MTG participation:


21 Jun 2016 - 16:55
Participation in CBMI 2016

Jordi Pons participates in the 14th International Workshop on Content-based Multimedia Indexing (CBMI 2016), which takes place in Bucharest from June 15th to 17th, 2016. He is presenting the following article:

15 Jun 2016 - 10:33
Participation in Sonar+D 2016

As in previous years, the MTG participates in the Sónar Festival, which takes place from June 16th to 18th, 2016, specifically in its professional area, Sonar+D.

Our participation this year is focused on the following activities:

Sonar Innovation Challenge:

After 5 years of successfully organising the Barcelona Music Hack Day (MHD) in collaboration with Sonar+D, the MTG is now pushing forward a new activity within the festival: the Sónar Innovation Challenge (SIC).

The SIC is a platform for collaboration between innovative tech companies and creators (programmers, designers, artists) that aims to produce disruptive prototypes to be showcased on the main stage of Sonar+D. The interaction between companies and creators happens through challenges proposed by the companies themselves, seeking to boost the impact and visibility of the featured technologies and motivated by market needs for innovation. Challenges are not exclusively technology-driven; they can also be driven by content or artistic motivation.

In this first edition, the SIC hosts 4 challenges: Extended electronic music festival experience (Absolut Labs), Interactive playlist based on crowd behaviour (Deezer), Expressive gaming through gesture interaction (RAPID-MIX) and Collective smartphone experience (RAPID-MIX and CoSiMa).

We were truly thrilled by the quantity (over 100) and quality of the applications we received in this first edition. The Sónar Innovation Challenge was designed to attract creators with a wide variety of profiles and skills, and from this perspective the Open Call has been completely successful. There is a great balance of artists, coders, makers, designers and researchers: the perfect combination to form strong multidisciplinary teams.

The SIC started with an online phase in which each team of challengers collaborated over the internet with the mentors of their challenge to define roles within the team, describe the team's proposed solution, create a first prototype, and prepare a work plan for the three intensive days of the SIC's on-site phase. The on-site phase takes place from June 15th to June 17th, starting with a kick-off meeting at IronHack, followed by two more days of intensive work before the outcomes of each challenge are presented at Sonar+D.

Giant Steps Booth at Market Lab area:

As part of the dissemination activities of Giant Steps project, several prototypes and products developed during the project will be demoed in a booth dedicated to Giant Steps at the Market Lab area.

The MTG will present the "House harmonic filler", an expert agent for harmony specialized in house music, and "Drumming with Style", an expert agent for the variation and generation of rhythmic patterns. Reactable Systems will introduce ROTOR, a new app that brings the unique tangible experience of the Reactable tabletop to the iPad for the first time by allowing the use of tangible control objects on capacitive screens, as well as RhytmCat, a concatenative synthesis VST plugin developed in collaboration with the MTG. Native Instruments will present iMaschine2 for iOS, and JKU will present "Rhythm Variation".

Users will be able to play with all these applications, which will run synchronized in a shared session.

13 Jun 2016 - 14:35
María de Maeztu DTIC-UPF Data-driven Knowledge Extraction Workshop
28 Jun 2016 - 29 Jun 2016

Presentation of DTIC-UPF research in the context of the Maria de Maeztu Strategic Research Program on Knowledge Extraction.

13 Jun 2016 - 09:32
Seminar by Clarence Barlow on the use of speech analysis for resynthesis by acoustic instruments
14 Jun 2016

On Tuesday, June 14th 2016, at 15:30h in room 55.410, Clarence Barlow, composer from UC Santa Barbara, gives a talk on: "On Synthrumentation - the Spectral Analysis of Speech for Subsequent Resynthesis by Acoustic Instruments".

Abstract: ‘Synthrumentation’ is a technique for the resynthesis of speech with acoustic instruments developed by the composer Clarence Barlow in the early 1980s. Over the past decade instrumental speech synthesis has been thematised by a diverse range of composers (e.g. Peter Ablinger or Jonathan Harvey); however, Barlow’s work is rarely accorded the credit it deserves for the pioneering role it played in this field. This presentation seeks to explain the basic mechanics of the synthrumentation technique and also demonstrate its practical application through an analysis of Barlow’s ensemble piece Im Januar am Nil composed between 1981 and 1984. It should become apparent that Barlow never uses synthrumentation in its conceptually pure form, but rather its realisation is always integrated into an overarching musical context, which reflects Barlow’s general approach to musical invention allowing different factors to interact.
8 Jun 2016 - 16:56
Audiophiles Meetup
9 Jun 2016

An open platform and discussion group for audiophiles at UPF

What is it? Objectives:
• An open platform and discussion group for students to share and discuss their projects related to sound and music computing.
• Networking with fellow audio junkies.
• Bringing SMC master's students and Audio-Visual Engineering undergraduate students at UPF closer together.
• Discussing and sharing career prospects and research areas within sound and music computing.
• Introducing researchers and their projects at the MTG.
• Open discussions, presentations, workshops, performances, etc.
• Above all, it is a casual gathering; the scope of the meet-up is limited only by your ideas.

Some topics to be discussed during the meeting:
• ‘GlovAid’: a digital musical instrument using a glove (New Interfaces for Musical Expression, NIME)
• ‘Freesound drumsets using unconventional sounds’ (Music Information Retrieval, MIR)

8 Jun 2016 - 14:20