Effects of voice, instrumentation and production on the recognition of emotions communicated by pop-rock songs

Title: Effects of voice, instrumentation and production on the recognition of emotions communicated by pop-rock songs
Publication Type: Conference Paper
Year of Publication: 2017
Conference Name: I Congreso Internacional de Psicología de la Música y la Interpretación Musical
Authors: Nuñez, V., & Herrera, P.
Conference Start Date: 05/10/2017
Conference Location: Madrid
Abstract: We report a study on pop-rock music and the perceived emotions it can convey. Studies in this area typically present a short musical excerpt and ask listeners to rate the degree to which a certain emotion is expressed by the music. The excerpts used come from commercial “finished” productions, in which tracks containing voices, different instruments, and production effects (e.g., equalization, reverberation) have been combined according to stylistic conventions or aesthetic ideas. Our approach, by contrast, takes advantage of existing collections of music files in which the tracks are still unmixed. This allowed us to gather emotional ratings for voice-only, instruments-only, voice-and-instruments, and fully mixed variants of the same “song”. Our research question addressed the possibility that these different components convey emotional content differently. An online questionnaire was created with PsyToolkit, and 42 subjects (men and women, aged 16–65, with diverse levels of musical experience and interest) rated 30-second excerpts of music belonging to one of these categories: voice-only, instruments-only, voice-and-instruments, or full production. The emotion tags “happiness”, “sadness”, “surprise”, “irritation”, “contempt”, “love”, “positivity”, “negativity”, “animation” and “relaxation” were selected to capture both categorical and dimensional aspects of musical emotion. A 10-point discrete scale was used to rate the degree to which each emotion was present in each excerpt. Analysis of the results showed that positive emotions (happiness, animation) were perceived more frequently than negative ones (irritation, contempt), and that the presentation condition had a statistically significant effect on the reported ratings.
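The reported condition effect could be checked with a standard one-way analysis of the ratings across the four presentation categories. A minimal sketch, using synthetic ratings on the study's 10-point scale (the actual data and the specific statistical test used are not given here):

```python
import random
from statistics import mean

random.seed(0)
conditions = ["voice_only", "instruments_only", "voice_and_instruments", "full_production"]
# Synthetic 1-10 ratings for 42 listeners per condition, with an
# artificial per-condition shift to mimic a condition effect.
ratings = {
    c: [min(10, max(1, round(random.gauss(5 + i, 1.5)))) for _ in range(42)]
    for i, c in enumerate(conditions)
}

def one_way_anova_F(groups):
    """One-way ANOVA F statistic: between-group vs. within-group variance."""
    all_vals = [x for g in groups for x in g]
    grand = mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

F = one_way_anova_F(list(ratings.values()))
print(f"F = {F:.2f}")
```

With a genuine shift between conditions, the F statistic is large; the corresponding p-value would then be compared against the chosen significance level.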
A practical application of the reported study has been the creation of a computational model for predicting expressed emotions in music, based on automatically computed music descriptors and trained on the rated excerpts. The details of this model are, however, outside the scope of this presentation.
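The abstract does not describe the model, but the general idea of mapping computed descriptors to emotion ratings can be sketched as follows. The descriptor names, values, and learner here are invented for illustration only:

```python
import math

# Hypothetical training pairs: (tempo_bpm, spectral_centroid_khz)
# mapped to a mean "animation" rating on the study's 1-10 scale.
train = [
    ((70.0, 1.2), 3.0),
    ((95.0, 2.0), 5.5),
    ((128.0, 3.1), 8.0),
    ((140.0, 3.5), 9.0),
]

def predict(descriptors, k=2):
    """k-nearest-neighbour regression in the descriptor space."""
    dists = sorted((math.dist(descriptors, x), y) for x, y in train)
    return sum(y for _, y in dists[:k]) / k

print(predict((120.0, 3.0)))  # averages the two nearest training ratings
```

A real system would extract many more descriptors automatically from the audio and use the collected ratings as training targets.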
Final publication: http://www2.uned.es/psicologiaabierta/conpsimusica2017/index.htm