News and Events

PHAROS project finishes successfully
The successful final review of PHAROS took place at Orange Labs, Lannion, France, on March 18th, 2010. PHAROS, Platform for searcH of Audiovisual Resources across Online Spaces, started in January 2007 and ended in December 2009.

PHAROS was an Integrated Project co-financed by the European Union under the Information Society Technologies Programme (FP6) – Strategic Objective ‘Search Engines for Audiovisual Content’ (2.6.3). During the project, a complete, innovative and scalable search platform was developed, integrated and evaluated. One of the main characteristics of the PHAROS platform is the integration of automatic content analysis. In that context, the MTG developed web services to generate music annotations and search for similar music. This work was fully integrated and demoed at ICT 2008, SIGIR 2009, IBC 2009 and ISMIR 2009.

The main contributions of the MTG were the development of:

  • Music Annotation Web Service: including features such as Mood, Danceability, Intensity, Live/Studio, Tempo and Key
  • Music Similarity Web Service: scalable music similarity search based on high-level features (see the sketch below)
  • Multimodal Annotation Component: enabling different modalities to be merged, as demoed at ISMIR in the form of a Video Music Mood annotator
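
As a rough illustration of the kind of search the Music Similarity Web Service performs, below is a minimal Python sketch that ranks catalogue tracks by their distance to a query track over a few high-level descriptors. The feature names, value ranges and catalogue entries are assumptions for illustration only and do not reflect the actual PHAROS schema or code.

    # Hypothetical sketch: nearest-neighbour search over high-level music
    # descriptors (values normalised to [0, 1] here purely for illustration).
    import math

    def distance(a, b):
        """Euclidean distance over the descriptors shared by both tracks."""
        shared = a.keys() & b.keys()
        return math.sqrt(sum((a[k] - b[k]) ** 2 for k in shared))

    def most_similar(query, catalogue, n=3):
        """Return the n catalogue tracks closest to the query."""
        return sorted(catalogue, key=lambda t: distance(query, catalogue[t]))[:n]

    catalogue = {
        "track_a": {"danceability": 0.80, "intensity": 0.60, "tempo": 0.55},
        "track_b": {"danceability": 0.20, "intensity": 0.30, "tempo": 0.40},
        "track_c": {"danceability": 0.70, "intensity": 0.70, "tempo": 0.60},
    }
    query = {"danceability": 0.75, "intensity": 0.65, "tempo": 0.58}
    print(most_similar(query, catalogue))  # track_c and track_a rank first

In the platform itself such ranking must of course run at scale over a large indexed catalogue; the sketch only conveys the underlying idea of nearest-neighbour search on high-level features.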


30 Mar 2010 - 13:10
Reactable Live! is announced
Reactable Systems, a spin-off of the MTG, announces a new incarnation of the Reactable. This new version of the instrument is portable, easy to set up and built to endure heavy use on the road. It also includes new interactive graphics and a completely redesigned, high-quality audio processing core.

The Reactable Live! will be at Musikmesse Frankfurt. Meet the team and try it in Hall 5.1, stand D94.

24 Mar 2010 - 11:22
Open research position at the MTG
The Music Technology Group of the Universitat Pompeu Fabra offers a research position in the field of sound and gesture analysis. The work is linked to an EU-funded research project to be carried out in conjunction with a number of partners at a European level. The project approaches the study of creative communication within groups of people from the perspective of music ensemble performance and audience experience. Within it, expressive movement, audio and physiological multi-layer features will be extracted from participants using real-time, synchronized, multi-modal feature extraction techniques, and will serve as inputs for the theoretical and computational models to be developed.

Among the tasks foreseen for the MTG as part of the project, the duties of the successful applicant will include:
  • Design, construction and maintenance of a large database/repository of multi-modal data recorded from real musical performance scenarios.
  • Development/implementation of multi-modal data acquisition and/or synchronization techniques.
  • Development/implementation of algorithms for the analysis of sound and gesture signals.
  • Writing of periodically deliverable technical reports.


Requirements:

  • Computer Science / Electrical Engineering (or related) degree.
  • Experience in signal processing applied to musical sound description.
  • Excellent programming skills in C++ / scripting.
  • Knowledge of system administration for heterogeneous, online-access database maintenance.
  • Previous experience with gesture/motion analysis is preferred.
  • Enthusiastic attitude towards approaching new research challenges.
  • Ability to work in a team-oriented environment.
  • Proficiency in English language.


This position is full-time, on-site in Barcelona. Travel to and attendance at project meetings or workshops may occasionally be required. The working language in our Lab is English. Previous experience in similar working environments will be valued.

Interested candidates should send a resume and an introduction letter to mtg-info [at] llista [dot] upf [dot] edu

24 Mar 2010 - 10:53
Presentation of a new Phonos CD by Andrés Lewin-Richter
On March 18th at 19:30h at the Sala Mompou, the new CD of music by Andrés Lewin-Richter, produced by Phonos, will be presented.
15 Mar 2010 - 10:36
Seminar by R. Bargar and I. Choi on interactive authoring for media production

Robin Bargar and Insook Choi, from City University of New York, will give a seminar on Wednesday 10th of March at 18:00h in room 52.S31 on "Interactive Authoring with Semantic Reasoning for Heterogeneous Media Production".

Abstract: Computational models and intelligent signal processing are becoming mainstream methodologies in the media production chain. Media Authoring is a versatile practice that utilizes computational representations between interpreted languages and scripting. Robust models of computational architectures are becoming more feasible in media production as consumer platforms converge on standard processing capabilities that permit extensible applications in high-level programming languages with open source libraries. In this context we recently developed a working method and prototype system for authoring interactive media presentations supported by an ontological inference engine. The system supports real-time query across heterogeneous media resources, parallel media signal processing, and multiple display formats. Authoring is implemented as path-planning in ontological space; path members are concept nodes that generate queries and return media resources coupled to real-time displays. Ontology supports heterogeneous cross-referential capacity for media of multiple types. A dual-root-node data design links ontological reasoning with media metadata, which provides a method for defining hybrid semantic-quantitative relationships. Ontological organization enables users to author and explore media resources by concept-based navigation displaying relationships across media of diverse types, rather than isolating resource types.
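
As a purely illustrative aside on the idea of authoring as path-planning over concept nodes, the toy Python sketch below walks a small concept graph and collects the media resources that each node on the path would return. All names, data and structure are assumptions for illustration and do not represent the authors' system.

    # Toy sketch of "authoring as path-planning in ontological space":
    # concept nodes are vertices, and each node maps to media resources
    # in a metadata store. Names and data are illustrative assumptions.
    from collections import deque

    ontology = {            # concept -> related concepts (graph edges)
        "city": ["traffic", "architecture"],
        "traffic": ["sound_ambience"],
        "architecture": ["photography"],
        "sound_ambience": [],
        "photography": [],
    }

    media_index = {          # concept -> media resources tagged with it
        "traffic": ["audio/street_loop.wav"],
        "architecture": ["video/facades.mp4"],
        "sound_ambience": ["audio/night_hum.wav"],
        "photography": ["image/bridge.jpg"],
    }

    def author_path(start, goal):
        """Breadth-first path between two concept nodes."""
        queue, seen = deque([[start]]), {start}
        while queue:
            path = queue.popleft()
            if path[-1] == goal:
                return path
            for nxt in ontology.get(path[-1], []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None

    path = author_path("city", "sound_ambience")
    # Each concept on the path "queries" the index and returns media for display.
    playlist = [res for concept in path for res in media_index.get(concept, [])]
    print(path, playlist)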

Robin Bargar is Professor of Entertainment Technology and Insook Choi is the Director of the Emerging Media Technologies Program, both at the School of Technology and Design of City University of New York.

5 Mar 2010 - 14:02
Presentation of a new Phonos CD by Julian Elvira
On March 4th at 19:30h at the Sala Mompou, the CD by Julian Elvira, produced by Phonos, will be presented.
1 Mar 2010 - 17:19
The Postgraduate Course in Interactive Systems has started
The Postgraduate Course in Design of Interactive Music Systems offered by IDEC in collaboration with the Escola Superior de Música de Catalunya has started for the first time this month. The director of the course is Sergi Jordà, who teaches in it together with Carles F. Julià, Daniel Gallardo and Mathieu Bosi.

The students enrolled in the course are: Arecio Gonzalez, Oriol Soler, Rafael Pérez, Ángel Andrés Cataño, Marcelo Rodríguez, Carlos Eduardo Cortés, Tomeu Coll, Jesús Gollonet, Daniel Cantero, Mario Pifarré, Aleix Fabra, and Victor Lloret.

This course is the second part of the Postgraduate Programme in Sonology. The first part, dedicated to Music Production Techniques, has just finished.

24 Feb 2010 - 14:26
Phonos Concert: Instruments and electronics
On Thursday 25th of February 2010 at 19:30h in the Espai Polivalent of the Tànger Building, the Phonos Foundation organizes a concert including various instrumental and electronic pieces.
23 Feb 2010 - 17:47
Phonos Concert: Ellen Fallowfield, cello and electronics
On Tuesday 23rd of February 2010 at 19:30h in the Espai Polivalent of the Tànger Building, the Phonos Foundation organizes a concert by Ellen Fallowfield performing pieces for cello and electronics.
19 Feb 2010 - 12:36
Seminar by Michael Gurevich on "Style and Constraint in Electronic Musical Instruments"

Michael Gurevich, from the School of Music and Sonic Arts of Queen's University Belfast, will give a seminar on "Style and Constraint in Electronic Musical Instruments" on Wednesday 24th of February 2010 in room 52.S31 at 18h. This seminar takes place in the context of the Postgraduate Programme in Sonology.

Abstract:
Style is the ability to impart a unique and personal imprint on an activity; it is what distinguishes a virtuosic performer from a highly skilled one. To experience a performance with style, one must be able to assess not only what the performer is doing, but how they are doing it, or how they are doing it differently than someone else would. But digital musical instruments, and digital devices in general, present significant challenges to the cultivation of style and our ability to perceive it. In this talk, I will present our concept of style and its impact on the user experience and the spectator experience, both in the context of music performance and in interactions with a wide array of new technologies. I will report on the results of a recent study of performers that explored the relationship between constraint in design and the development of style.

19 Feb 2010 - 11:00