Head-tracking for gestural and continuous control of parameterized audio effects

  • Authors: David Merrill
  • Affiliation: MIT Media Lab, Cambridge, MA
  • Venue: NIME '03: Proceedings of the 2003 Conference on New Interfaces for Musical Expression
  • Year: 2003

Abstract

This paper describes a system that uses the output of head-tracking and gesture-recognition software to drive a parameterized guitar effects synthesizer in real time.
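To make the idea of continuous, tracker-driven parameter control concrete, the following is a minimal sketch of that kind of mapping: a head-orientation reading is normalized and scaled onto an effect parameter inside a real-time control loop. The yaw range, the log-scaled filter cutoff, the 50 Hz update rate, and the simulated tracker are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: map a head-tracker yaw angle onto a continuous
# audio-effect parameter. All ranges and the simulated tracker are assumptions.
import math
import time


def yaw_to_param(yaw_deg: float, yaw_min: float = -45.0, yaw_max: float = 45.0) -> float:
    """Clamp a head-yaw angle (degrees) and map it linearly to [0, 1]."""
    yaw = max(yaw_min, min(yaw_max, yaw_deg))
    return (yaw - yaw_min) / (yaw_max - yaw_min)


def param_to_cutoff(p: float, lo_hz: float = 200.0, hi_hz: float = 4000.0) -> float:
    """Map a normalized control value to a filter cutoff on a log scale."""
    return lo_hz * (hi_hz / lo_hz) ** p


def simulated_yaw(t: float) -> float:
    """Stand-in for head-tracker output: slow left-right head motion."""
    return 45.0 * math.sin(0.5 * t)


if __name__ == "__main__":
    start = time.time()
    for _ in range(5):                      # a few frames of the control loop
        t = time.time() - start
        p = yaw_to_param(simulated_yaw(t))  # tracker reading -> [0, 1]
        cutoff = param_to_cutoff(p)         # [0, 1] -> effect parameter
        print(f"t={t:5.2f}s  yaw_param={p:.2f}  cutoff={cutoff:7.1f} Hz")
        time.sleep(0.02)                    # ~50 Hz update rate (assumed)
```

In a real system the `simulated_yaw` stand-in would be replaced by the head-tracking software's output stream, and the computed parameter would be sent to the effects synthesizer each frame rather than printed.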