A computer participant in musical improvisation. Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems.
Managing gesture and timbre for analysis and instrument control in an interactive environment. NIME '06: Proceedings of the 2006 Conference on New Interfaces for Musical Expression.
Automatic mood detection and tracking of music audio signals. IEEE Transactions on Audio, Speech, and Language Processing.
In previous publications (see, for example, [2] and [3]), we described an interactive music system designed to improvise with saxophonist John Butcher; the system analyzes timbral and gestural features in real time and uses this information to guide response generation. This paper gives an overview of our recent work on the system's interaction management component (IMC). We explore several options for characterizing improvisation at a higher level and for managing decisions in interactive performance within a rich timbral environment. We developed a simple, efficient framework built on a small number of features suggested by recent work on mood modeling in music. We describe and evaluate the first version of the IMC, which was used in performance at the Live Algorithms for Music (LAM) conference in December 2006. We also touch on developments since LAM and discuss plans to address perceived shortcomings in responsiveness and in the system's ability to make long-term adaptations.
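To make the idea of a small feature set for mood-like characterization concrete, the following is a minimal sketch, not the authors' implementation: it computes two coarse timbral features per audio frame (RMS energy as a rough arousal proxy, spectral centroid as a rough brightness proxy) and maps them to one of four hypothetical states. All function names and thresholds here are illustrative assumptions.

```python
import numpy as np

def frame_features(frame: np.ndarray, sr: int) -> dict:
    """Two coarse timbral features for one audio frame:
    RMS energy (loudness/arousal proxy) and spectral centroid
    (brightness proxy). Frame is a 1-D array of samples."""
    rms = float(np.sqrt(np.mean(frame ** 2)))
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    total = spectrum.sum()
    centroid = float((freqs * spectrum).sum() / total) if total > 0 else 0.0
    return {"rms": rms, "centroid": centroid}

def characterize(features: dict, rms_thresh: float = 0.1,
                 centroid_thresh: float = 1000.0) -> str:
    """Map the two features to one of four coarse states.
    Thresholds are hypothetical, chosen only for illustration."""
    loud = features["rms"] >= rms_thresh
    bright = features["centroid"] >= centroid_thresh
    if loud and bright:
        return "energetic"
    if loud:
        return "intense"
    if bright:
        return "delicate"
    return "calm"

# Example: a quiet 220 Hz tone vs. a loud 2 kHz tone.
sr = 16000
t = np.arange(2048) / sr
quiet_low = 0.01 * np.sin(2 * np.pi * 220 * t)
loud_high = 0.5 * np.sin(2 * np.pi * 2000 * t)
print(characterize(frame_features(quiet_low, sr)))   # calm
print(characterize(frame_features(loud_high, sr)))   # energetic
```

In a real-time setting, such per-frame labels would be smoothed over time before driving response-generation decisions, since single-frame estimates of timbral character are noisy.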