Biblio
Filters: Author is Rafael Ramirez
A Brain-Gaze Controlled Musical Interface. Berlin BCI Workshop 2012 - Advances in Neurotechnology. (2012).
A Percussive Musical Interface for a Quadriplegic Patient. International Symposium on Performance Science 2015. 60-61. (2015).
Is an Auditory P300-based Brain-Computer Musical Interface Feasible? CMMR2015: International Workshop on BCMI. (2015).
Towards a Low Cost Mu-Rhythm Based BCI. Fifth International Brain-Computer Interface Meeting 2013. (2013).
EEG Signal Classification in a Brain-Computer Music Interface. 8th International Workshop on Machine Learning and Music. 28-30. (2015).
The EyeHarp: An Eye-Tracking-Based Musical Instrument. 8th Sound and Music Computing Conference. (2011).
Digital Musical Instruments for People with Physical Disabilities. Department of Information and Communication Technologies. 163. (2016).
The EyeHarp: A Gaze-Controlled Digital Musical Instrument. Frontiers in Psychology. 7. (2016).
Temporal Control in the EyeHarp Gaze-Controlled Musical Interface. New Interfaces for Musical Expression (NIME) 2012. (2012).
P300 Harmonies: A Brain-Computer Musical Interface. International Computer Music Conference/Sound and Music Computing Conference. (2014).
A High-Throughput Auditory P300 Interface for Everyone. 12th European AAATE Conference. 478-482. (2013).
Training a Classifier to Detect Instantaneous Musical Cognitive States. International Conference on Music Perception and Cognition. (2006).
An fMRI Study on Attentive Music Listening. The Neurosciences and Music. (2008).
Intra-note Features Prediction Model for Jazz Saxophone Performance. International Computer Music Conference. (2005).
A Sequential Covering Evolutionary Algorithm for Expressive Music Performance. Conference on Innovative Applications of Artificial Intelligence. (2006).
A Machine Learning Approach to Expressive Performance in Jazz Standards. ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. (2004).
A Tool for Generating and Explaining Expressive Music Performances of Monophonic Jazz Melodies. International Journal on Artificial Intelligence Tools. 15, 673-691. (2006).
Modeling Expressive Music Performance in Bassoon Audio Recordings. Intelligent Computing in Signal Processing and Pattern Recognition. 345, 951-957. (2006).
Discovering Expressive Transformation Rules from Saxophone Jazz Performances. Journal of New Music Research. 34, 319-330. (2005).
A Rule-Based Evolutionary Approach to Music Performance Modeling. IEEE Transactions on Evolutionary Computation. 16(1), 96-107. (2012).
Modeling Violin Performances Using Inductive Logic Programming. Intelligent Data Analysis. (2010).
An Approach to Expressive Music Performance Modeling. 118th Audio Engineering Society Convention. (2005).