Abstract

Physiology-based systems have led to implicit interaction models in which signals from the human body are used to control devices and applications by means other than muscular movement or speech. However, most of these systems focus on single-user settings, and their application in collaborative scenarios is still scarce. In this project we present a system for collaborative sound generation and control built from a hybrid Brain-Computer Interface (BCI) device featuring electroencephalography (EEG) and electrocardiography (ECG), and the Reactable, a musical instrument based on a Tangible User Interface (TUI). We assessed collaborative performance and motivational variables in a task-oriented experiment based on the imitation of pre-recorded sound references. Measures were obtained through self-reported questionnaires. Teams of subjects with no previous experience with the Reactable used two different methods for sound generation and control: implicit interaction, through physiological signals (EEG and ECG), and explicit interaction, by means of physical manipulation. The study revealed four main effects of physiology-based interaction applied to a collaborative performance on a TUI. (1) Teams working with a combination of implicit and explicit interaction models reported less difficulty and greater ease in solving the tasks. (2) They also showed higher levels of confidence during the performance. (3) The distribution of control and leadership was balanced and showed no significant difference between the two proposed interaction paradigms. (4) Teams showed a significant correlation in key aspects of collaboration, such as confidence and motivation over time. These findings suggest that the physiological signal extraction and processing implemented in this system could be linked to subtler descriptors related to affective states, emotional responses, or music perception, allowing more advanced and stable methods for sound generation and control.
We also propose including prior training sessions and conducting experiments with musicians to test the expressiveness of the proposed system. This work presents a user-friendly configuration for a collaborative sound composition experience. It encourages the development of Computer-Supported Collaborative Systems in which subtle sources of information, such as physiological states, effectively support an explicit model of interaction, as in the case of tangible tabletop interfaces, by means of non-invasive, wireless devices that preserve adequate conditions for live performance.