Using music to interact with a virtual character

  • Authors:
  • Robyn Taylor; Daniel Torres; Pierre Boulanger

  • Affiliation:
  • University of Alberta, Edmonton, Alberta, Canada (all authors)

  • Venue:
  • NIME '05 Proceedings of the 2005 conference on New interfaces for musical expression
  • Year:
  • 2005


Abstract

We present a real-time system that allows musicians to interact with synthetic virtual characters as they perform. Using Max/MSP to parameterize keyboard and vocal input, meaningful features (pitch, amplitude, chord information, and vocal timbre) are extracted from the live performance in real time. These extracted musical features are then mapped to character behaviour in such a way that the musician's performance elicits a response from the virtual character. The system uses the ANIMUS framework to generate believable character expressions. Experimental results are presented for simple characters.
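The feature-to-behaviour mapping described above can be illustrated with a minimal sketch. This is not the authors' Max/MSP or ANIMUS code; the feature names, value ranges, and mapping rules below are hypothetical, chosen only to show the general idea of translating extracted musical features into character-behaviour parameters.

```python
# Hypothetical illustration (not the paper's implementation): map musical
# features extracted from a live performance to character-behaviour values.
from dataclasses import dataclass


@dataclass
class MusicalFeatures:
    pitch_hz: float       # fundamental frequency of the vocal/keyboard input
    amplitude: float      # normalized loudness in [0, 1]
    chord_is_major: bool  # simple chord-quality flag from keyboard input


def map_to_behaviour(f: MusicalFeatures) -> dict:
    """Map features to illustrative behaviour parameters in [0, 1]."""
    # Normalize pitch over a rough assumed vocal range (80-1000 Hz).
    pitch_norm = min(max((f.pitch_hz - 80.0) / (1000.0 - 80.0), 0.0), 1.0)
    return {
        "arousal": f.amplitude,                       # louder -> more excited
        "attention": pitch_norm,                      # higher pitch -> looks up
        "valence": 1.0 if f.chord_is_major else 0.3,  # major chord -> happier
    }
```

In a real system such a mapping would run once per analysis frame, with the resulting parameters driving the character's animation and expression engine.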