Note: This bibliographic page is archived and will no longer be updated. For an up-to-date list of publications from the Music Technology Group, see the Publications list.

Study of biomechanics in violin performances with Kinect and its relationship with sound

Title Study of biomechanics in violin performances with Kinect and its relationship with sound
Publication Type Master Thesis
Year of Publication 2017
Authors Fernandez Blanco, P.
Abstract Playing a musical instrument is a highly complex activity that requires a combination of mental and sensorimotor skills acquired over a long learning trajectory. Traditional music performance pedagogy is mostly based on imitation and on feedback given through semantic metaphors and imagery that attempt to explain an acoustic quality or a physiological aspect of the performance, such as posture or the feeling of movement. Such an approach is based on subjective and vague perceptions that, in many cases, result in a frustrating lack of progress, chronic problems, forced breaks from performing for extended periods, or even career-ending injury.

The long-term aim of this research is to study the scientific and technological challenges around the analysis and modeling of human motion in musical practice as a sensorimotor activity; it involves the acquisition, description, storage and analysis of datasets of motion and audio data captured during real performances. More specifically, this work deals with the measurement and description of the kinematics and dynamics of human upper-body motion during violin playing. A model of expert performers is built as a reference against which amateurs can be compared and automatically evaluated. Human body motion is typically measured with camera-based systems or with wearable inertial sensors (e.g. accelerometers, gyroscopes) attached to the body. Both techniques, along with biomechanical models, can be used to extract features such as joint angles, posture and joint rotation speed, among many others; these could be used to improve performance coordination and to reduce the risk of injury by measuring and giving feedback on prolonged joint extension and angular rotation intensity. In this work we aim to develop low-cost implementations, using off-the-shelf technology, namely RGB-D cameras, to make the system available to a large potential user group. In general, the accuracy of low-cost systems is low; however, with the advent of consumer-level 3D cameras such as the Microsoft Kinect, body motion tracking has become feasible.

The motion of the violinist's upper body is acquired by means of a Microsoft Kinect camera programmed using the provided C# SDK, which is able to track the positions of the human body's joints and bones. Based on the positions of these joints, we estimate two types of features: first, kinematic features such as angles between bones, joint positions, displacement, speed and acceleration; and second, dynamic features (i.e. muscle forces in motion). Dynamic features are estimated by fitting the previously computed kinematic features to a biomechanical model. Kinematic features allow us to analyze the motion and pose of violin performers, while dynamic features are important for detecting overuse and helping to prevent possible muscular injuries. The fitting to the biomechanical model is done with the OpenSim software. Statistical analysis and machine learning techniques are used to build a reference model of expert performers that can be used for comparison with amateurs and to give feedback and an automatic evaluation of any performance. In this way we characterize violin performances in terms of kinematic and dynamic features of the human body and build a reference expert model that makes it possible to automatically classify performers as experts or amateurs. By means of the reference model, we can additionally measure how close a new performance is to the reference and give an automatic rating of its quality as feedback. We have recorded a small database of experts, amateurs and non-violinists, and our model correctly classifies 90% of the cases.
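The thesis code itself is not published on this page; as a minimal sketch of the kinematic step the abstract describes (joint angles, speed and acceleration from tracked joint positions), the snippet below assumes Kinect joint streams have already been exported as per-frame 3D coordinates. The joint names, frame rate and synthetic trajectories are purely illustrative, and the dynamic features obtained by fitting to an OpenSim model are not covered here.

```python
import numpy as np

FPS = 30.0  # Kinect v2 skeleton streams run at roughly 30 frames per second

def joint_angle(a, b, c):
    """Angle at joint b (radians) between the bones b->a and b->c.

    a, b, c: (T, 3) arrays of 3D joint positions over T frames,
    e.g. shoulder, elbow and wrist of the bowing arm.
    """
    u, v = a - b, c - b
    cos = np.sum(u * v, axis=1) / (
        np.linalg.norm(u, axis=1) * np.linalg.norm(v, axis=1)
    )
    return np.arccos(np.clip(cos, -1.0, 1.0))

def speed_and_acceleration(pos, fps=FPS):
    """Per-frame speed and acceleration magnitudes of a (T, 3) trajectory."""
    vel = np.gradient(pos, 1.0 / fps, axis=0)
    acc = np.gradient(vel, 1.0 / fps, axis=0)
    return np.linalg.norm(vel, axis=1), np.linalg.norm(acc, axis=1)

# Synthetic stand-in for three exported Kinect joint streams (300 frames).
T = 300
t = np.linspace(0.0, T / FPS, T)
shoulder = np.zeros((T, 3))
elbow = shoulder + np.stack(
    [0.1 * np.sin(t), -0.3 * np.ones(T), np.zeros(T)], axis=1)
wrist = elbow + np.stack(
    [0.2 * np.cos(t), -0.2 * np.ones(T), np.zeros(T)], axis=1)

angle = joint_angle(shoulder, elbow, wrist)   # elbow flexion over time
speed, accel = speed_and_acceleration(wrist)  # bowing-hand motion intensity
summary = [angle.mean(), angle.std(), speed.mean(), accel.mean()]
```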
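Likewise, the abstract does not name the classifier behind the reported 90% figure. The following is one plausible sketch, assuming one summary feature vector per recorded performance and using a scikit-learn SVM with cross-validation; all data here are synthetic stand-ins, so the score it prints bears no relation to the thesis result.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# One row of summary kinematic/dynamic features per recorded performance,
# with labels 0 = non-violinist, 1 = amateur, 2 = expert (synthetic here).
X = rng.normal(size=(60, 8))
y = rng.integers(0, 3, size=60)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")

# The reference model also supports a graded rating: the distance from a new
# performance's features to the expert centroid, closer meaning better.
expert_centroid = X[y == 2].mean(axis=0)
new_performance = rng.normal(size=8)
rating = -float(np.linalg.norm(new_performance - expert_centroid))
```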