Visual feedback in performer-machine interaction for musical improvisation

  • Authors: Alexandre R. J. François, Elaine Chew, Dennis Thurmond

  • Affiliation: Univ. of Southern California, Los Angeles, CA (all authors)

  • Venue:
  • NIME '07 Proceedings of the 7th international conference on New interfaces for musical expression
  • Year:
  • 2007


Abstract

This paper describes the design of Mimi, a multi-modal interactive musical improvisation system that explores the potential impact of visual feedback in performer-machine interaction. Mimi is a performer-centric tool designed for use in performance and teaching. Its key and novel component is its visual interface, which provides the performer with instantaneous and continuous information on the state of the system. Because context and planning are paramount in human improvisation, the relevant system state extends to the near future and the recent past. Mimi's visual interface thus allows for a distinctive blend of the raw reflex typically associated with improvisation and the preparation and timing more closely associated with score-based reading. Mimi is not only an effective improvisation partner; it has also proven to be an invaluable platform for interrogating the mental models necessary for successful improvisation.