Resonant processing of instrumental sound controlled by spatial position

  • Authors:
  • Camille Goudeseune, Guy Garnett, Timothy Johnson

  • Affiliation:
  • University of Illinois at Urbana-Champaign, Urbana, IL (all authors)

  • Venue:
  • NIME '01: Proceedings of the 2001 Conference on New Interfaces for Musical Expression
  • Year:
  • 2001


Abstract

We present an acoustic musical instrument played through a resonance model of another sound. The resonance model is controlled in real time as part of the composite instrument. Our implementation uses an electric violin, whose spatial position modifies filter parameters of the resonance model. Simplicial interpolation defines the mapping from spatial position to filter parameters. With some effort, pitch tracking can also control the filter parameters. The individual technologies -- motion tracking, pitch tracking, resonance models -- are easily adapted to other instruments.
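The mapping the abstract describes, from a tracked spatial position to a vector of filter parameters via simplicial interpolation, can be illustrated with a small sketch. The following is a minimal example, not the paper's implementation: the anchor positions, parameter vectors, and use of SciPy's Delaunay triangulation are all assumptions made for illustration.

```python
# Sketch of simplicial interpolation: map a 2D position to filter parameters
# by barycentric interpolation inside the simplex that contains the position.
# Anchor points and parameter values below are hypothetical, not from the paper.
import numpy as np
from scipy.spatial import Delaunay

# Anchor positions in the control space (e.g. violin position, metres).
positions = np.array([
    [0.0, 0.0],
    [1.0, 0.0],
    [0.0, 1.0],
    [1.0, 1.0],
])

# Filter-parameter vectors at each anchor (e.g. centre frequency Hz, gain, Q).
params = np.array([
    [200.0, 0.9, 2.0],
    [800.0, 0.5, 5.0],
    [400.0, 0.7, 3.0],
    [1600.0, 0.3, 8.0],
])

tri = Delaunay(positions)  # triangulate the control space once, up front

def interpolate(p):
    """Return filter parameters for position p by barycentric interpolation
    within the enclosing simplex of the triangulation."""
    s = tri.find_simplex(p)
    if s == -1:
        raise ValueError("position lies outside the triangulated region")
    # Barycentric coordinates of p with respect to simplex s.
    T = tri.transform[s]
    b = T[:2].dot(np.asarray(p) - T[2])
    bary = np.append(b, 1.0 - b.sum())
    # Weighted average of the parameter vectors at the simplex vertices.
    return bary.dot(params[tri.simplices[s]])

print(interpolate([0.25, 0.5]))  # parameters for one tracked position
```

In a real-time setting, the triangulation is built once from the chosen anchor points, and only the simplex lookup and barycentric weighting run per tracked position, so the per-frame cost stays small.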