Biblio
Filters: Author is Rafael Ramirez
Temporal Control in the EyeHarp Gaze-Controlled Musical Interface.
New Interfaces for Musical Expression (NIME) 2012.
(2012).
P300 Harmonies: A Brain-Computer Musical Interface.
International Computer Music Conference/Sound and Music Computing Conference.
(2014).
A High-Throughput Auditory P300 Interface for Everyone.
12th European AAATE Conference. 478-482.
(2013).
A Percussive Musical Interface for a Quadriplegic Patient.
International Symposium on Performance Science 2015. 60-61.
(2015).
A Brain-Gaze Controlled Musical Interface.
Berlin BCI Workshop 2012 - Advances in Neurotechnology.
(2012).
Is an auditory P300-based Brain-Computer Musical Interface feasible?
CMMR2015: International Workshop on BCMI.
(2015).
Towards a Low Cost Mu-Rhythm Based BCI.
Fifth International Brain-Computer Interface Meeting 2013.
(2013).
EEG Signal Classification in a Brain-Computer Music Interface.
8th International Workshop on Machine Learning and Music. 28-30.
(2015).
The EyeHarp: An Eye-Tracking-Based Musical Instrument.
8th Sound and Music Computing Conference.
(2011).
Digital Musical Instruments for People with Physical Disabilities.
Department of Information and Communication Technologies. 163 pp.
(2016).
The EyeHarp: A Gaze-Controlled Digital Musical Instrument.
Frontiers in Psychology. 7.
(2016).
Automatic Performer Identification in Celtic Violin Audio Recordings.
Journal of New Music Research. 40(2), 165-174.
(2011).
Modeling Celtic Violin Expressive Performance.
International Workshop on Machine Learning and Music, International Conference on Machine Learning.
(2008).
Automatic performer identification in commercial monophonic Jazz performances.
Pattern Recognition Letters. 31(12), 1514-1523.
(2010).
A Genetic Rule-based Expressive Performance Model for Jazz Saxophone.
Computer Music Journal. 32, 38-50.
(2008).
A Framework for Performer Identification in Audio Recordings.
International Workshop on Machine Learning and Music - ECML-PKDD 09.
(2009).
Modeling expressive music performance in jazz.
International Florida Artificial Intelligence Research Society Conference.
(2005).
Identifying saxophonists from their playing styles.
30th AES Conference.
(2007).
Performer Identification in Celtic Violin Recordings.
International Conference on Music Information Retrieval.
(2008).
Understanding expressive transformations in saxophone jazz performances using inductive machine learning.
Sound and Music Computing Conference.
(2004).
A Machine Learning Approach to Expressive Performance in Jazz Standards.
Multimedia Data Mining and Knowledge Discovery.
(2006).
Training a Classifier to Detect Instantaneous Musical Cognitive States.
International Conference on Music Perception and Cognition.
(2006).
An fMRI Study on Attentive Music Listening.
The Neurosciences and Music.
(2008).
Intra-note Features Prediction Model for Jazz Saxophone Performance.
International Computer Music Conference.
(2005).