Design issues in interaction modeling for free improvisation

  • Authors: William Hsu
  • Affiliation: San Francisco State University, San Francisco, CA
  • Venue: NIME '07: Proceedings of the 7th International Conference on New Interfaces for Musical Expression
  • Year: 2007

Abstract

In previous publications (see, for example, [2] and [3]), we described an interactive music system designed to improvise with saxophonist John Butcher; the system analyzes timbral and gestural features in real time and uses this information to guide response generation. This paper gives an overview of our recent work on the system's interaction management component (IMC). We explore several options for characterizing an improvisation at a higher level and for managing decisions in interactive performance within a rich timbral environment. We developed a simple, efficient framework based on a small number of features suggested by recent work on mood modeling in music. We describe and evaluate the first version of the IMC, which was used in performance at the Live Algorithms for Music (LAM) conference in December 2006. We also touch on developments in the system since LAM, and discuss plans to address perceived shortcomings in responsiveness and in the system's ability to make long-term adaptations.
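The abstract describes mapping a small set of timbral and gestural features to a higher-level characterization that then guides response decisions. The sketch below is purely illustrative of that general idea, not the paper's actual IMC: the feature names (loudness, brightness, onset density), the two mood-style axes, the thresholds, and the class ToyIMC are all hypothetical stand-ins introduced here for illustration.

```python
# Illustrative sketch only: a toy interaction-management loop that smooths a few
# extracted audio features into a coarse, mood-style characterization and picks
# a response behaviour. Names, thresholds, and mode labels are hypothetical.

from dataclasses import dataclass


@dataclass
class FrameFeatures:
    loudness: float       # normalized 0..1
    brightness: float     # e.g. scaled spectral centroid, 0..1
    onset_density: float  # onsets per second, normalized 0..1


class ToyIMC:
    """Smooths incoming features and chooses a response mode per analysis frame."""

    def __init__(self, smoothing: float = 0.9):
        self.smoothing = smoothing
        self.activity = 0.0  # rough "energy/arousal" axis
        self.texture = 0.0   # rough "smooth vs. noisy" axis

    def update(self, f: FrameFeatures) -> str:
        # Exponential smoothing keeps decisions from flickering frame to frame.
        a = 0.5 * f.loudness + 0.5 * f.onset_density
        self.activity = self.smoothing * self.activity + (1 - self.smoothing) * a
        self.texture = self.smoothing * self.texture + (1 - self.smoothing) * f.brightness

        # Crude hand-set decision rules; a real system would adapt these over time.
        if self.activity > 0.7:
            return "contrast: sparse, sustained response"
        if self.activity < 0.2:
            return "initiate: introduce new material"
        if self.texture > 0.6:
            return "shadow: follow with noisy textures"
        return "support: mirror current gestural density"


# Example usage with a fabricated frame of features.
if __name__ == "__main__":
    imc = ToyIMC()
    print(imc.update(FrameFeatures(loudness=0.8, brightness=0.4, onset_density=0.9)))
```

The point of the sketch is only the shape of the decision loop: per-frame features are reduced to a couple of slowly varying dimensions, and response modes are selected from those dimensions rather than from raw audio data.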