Our goal is to develop a coplayer music robot capable of presenting musical expression together with humans. Although many instrument-performing robots exist, they have difficulty playing with human performers because they lack a synchronization function. The robot has to follow variations in a human's performance, such as temporal fluctuations, to play alongside human performers. To cope with erroneous synchronization, we classify synchronization and musical expression into two levels: (1) the melody level and (2) the rhythm level. The idea is as follows: when synchronization with the melody is reliable, the robot responds to the pitch it hears; when synchronization is uncertain, it tries to follow the rhythm of the music. Our method estimates the score position for the melody level and the tempo for the rhythm level. The reliability of the score-position estimate is extracted from the probability distribution of the score position. Experimental results demonstrate that our method outperforms an existing score-following system on 16 of 20 polyphonic songs, reducing the score-position prediction error by 69% on average. The results also reveal that the switching mechanism alleviates errors in the score-position estimate.
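The switching idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes the score-position belief is a discrete probability distribution, and it takes the peak posterior probability as the reliability measure (the paper extracts reliability from the distribution, but the exact statistic is an assumption here). All names (`reliability`, `next_action`, the threshold value) are hypothetical.

```python
def reliability(position_dist):
    """Confidence of the score-position estimate, taken here as the
    peak posterior probability (an illustrative choice, not the
    paper's exact statistic)."""
    return max(position_dist)

def next_action(position_dist, tempo_bpm, threshold=0.5):
    """Melody level when the estimate is reliable, rhythm level otherwise."""
    if reliability(position_dist) >= threshold:
        # Melody level: commit to the most likely score position and
        # respond to the pitch heard there.
        position = position_dist.index(max(position_dist))
        return ("melody", position)
    # Rhythm level: the position estimate is too uncertain, so fall
    # back to following the tempo (beat period in seconds).
    return ("rhythm", 60.0 / tempo_bpm)

# Peaked distribution -> reliable, so follow the melody.
print(next_action([0.05, 0.8, 0.1, 0.05], tempo_bpm=120))
# Flat distribution -> uncertain, so follow the rhythm.
print(next_action([0.25, 0.25, 0.25, 0.25], tempo_bpm=120))
```

In a real system the distribution would come from an online score follower (e.g. a particle-filter posterior over score positions), and the threshold would be tuned so that the robot degrades gracefully to tempo-following rather than chasing a wrong position estimate.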