#MusicBricks
Musical Building Blocks for Digital Makers and Content Creators

Project overview

MusicBricks was an Innovation Action project under the EC Horizon 2020 Programme (01/2015-06/2016). The project was coordinated by Stromatolite (led by Michela Magas) and brought together four music technology research labs in the consortium (IRCAM, Fraunhofer IDMT, TU Wien and MTG-UPF). The main goal of MusicBricks was to show how music technology can be a good testbed for ‘innovation’, proposing a methodology to bring research results to society in the form of new products or cultural productions.

Main results:

First, the four research labs (IRCAM, Fraunhofer IDMT, TU Wien and MTG-UPF) provided a collection of research tools, compiled in the #MusicBricks Toolkit. This toolkit was used by creative communities in a number of “Creative Testbeds”, i.e. hackathon events, including the MusicHackDay in Barcelona in 2015 (organized by MTG-UPF) and three editions of the MusicTechFest (Sweden, Slovenia and Berlin).

Eleven hack projects were selected from the Creative Testbeds and went through an incubation phase (“Industrial Testbed”) for further development with the support of the research partners. The results of all the projects can be found at http://musictechfest.net/projects/. Some of the teams included current and former MTG researchers.

In the final phase, the “Market Testbed”, some of the projects were mentored to develop their work further towards an end product. By the end of MusicBricks, some of these incubated projects had resulted in end products: a patent-pending commercial industrial product (Dolphin), fully working demo prototypes (Sound In Translation and Snitch), and a performance production (#FSBS).

MTG's role and contributions:

- Essentia was tested and further expanded following feature requests from users. For example, the EssentiaRT~ modules [1], a partial real-time implementation of the library, were widely tested and used in the project hackathons. Additionally, a number of new sound synthesis algorithms were added so that the library can also be used to generate and process sounds [2]; a minimal usage sketch is given after the reference links below.

[1] https://www.upf.edu/web/mtg/essentiart
[2] https://github.com/MTG/essentia/tree/master/src/algorithms/synthesis
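As an illustration of how those synthesis algorithms can be used, the following is a minimal sketch (not taken from the project deliverables) written against Essentia's Python bindings. The input file name `audio.wav`, the frame/hop sizes and the algorithm parameters are placeholder assumptions; SineModelAnal and SineModelSynth are among the algorithms under the synthesis folder linked in [2].

```python
# Minimal sketch: sinusoidal analysis/resynthesis with Essentia's
# synthesis algorithms (assumes a local mono file 'audio.wav').
import essentia.standard as es
import numpy as np

frame_size = 2048
hop_size = 512
sample_rate = 44100

audio = es.MonoLoader(filename='audio.wav', sampleRate=sample_rate)()

window = es.Windowing(type='hann', size=frame_size)
fft = es.FFT(size=frame_size)
sine_anal = es.SineModelAnal(sampleRate=sample_rate, maxnSines=100)
sine_synth = es.SineModelSynth(sampleRate=sample_rate,
                               fftSize=frame_size, hopSize=hop_size)
ifft = es.IFFT(size=frame_size)
overlap_add = es.OverlapAdd(frameSize=frame_size, hopSize=hop_size)

output = np.array([], dtype=np.float32)
for frame in es.FrameGenerator(audio, frameSize=frame_size, hopSize=hop_size):
    spectrum = fft(window(frame))                       # complex spectrum of the frame
    freqs, mags, phases = sine_anal(spectrum)           # track sinusoidal partials
    resynth_spectrum = sine_synth(mags, freqs, phases)  # rebuild a spectral frame
    output = np.append(output, overlap_add(ifft(resynth_spectrum)))

es.MonoWriter(filename='resynthesis.wav', sampleRate=sample_rate)(output)
```

The same analysis/synthesis chain can be turned into a transformation by modifying the partial frequencies or magnitudes between the analysis and synthesis steps.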

- In the context of the project, we also developed a demo application that integrates the Freesound API in a voice-controlled drum machine. Hands-Free Sound Machine is a voice and gesture controlled drum machine built using HTML5, the WebAudio API, the WebSpeech API and the Freesound API. The drum machine can be controlled by voice, telling it to start, stop, set the tempo and search for sounds for each pad; the sounds are retrieved from Freesound (a sketch of such a search query is shown below). IRCAM’s R-IoT sensor is used to activate and deactivate steps of the sequencer and to set the tempo. Hands-Free Sound Machine source code: https://github.com/MTG/hands-free-sound-machine.
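As a rough illustration of the kind of request the drum machine issues when a pad asks for a new sound, here is a small Python sketch against the Freesound APIv2 text-search endpoint. The project itself does this from JavaScript in the browser; the API key is a placeholder, and the choice of fields and query is an illustrative assumption rather than the application's exact behaviour.

```python
# Minimal sketch: searching Freesound for a pad sound via the APIv2
# text-search endpoint (API key below is a placeholder).
import requests

FREESOUND_TOKEN = 'YOUR_API_KEY'  # personal API key from freesound.org

def search_pad_sound(query, page_size=5):
    """Return (name, preview_url) pairs for the first few matches."""
    response = requests.get(
        'https://freesound.org/apiv2/search/text/',
        params={
            'query': query,
            'fields': 'name,previews',  # only fetch what we need
            'page_size': page_size,
            'token': FREESOUND_TOKEN,
        },
        timeout=10,
    )
    response.raise_for_status()
    return [(r['name'], r['previews']['preview-hq-mp3'])
            for r in response.json()['results']]

for name, url in search_pad_sound('kick drum'):
    print(name, url)
```

In the application, the returned preview URLs are loaded into WebAudio buffers and triggered by the step sequencer.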

- Hands-Free Sound Machine video: