Gesture interaction for electronic music performance

  • Authors: Reinhold Behringer
  • Affiliation: Leeds Metropolitan University, Leeds, UK
  • Venue: HCI'07 Proceedings of the 12th international conference on Human-computer interaction: intelligent multimodal interaction environments
  • Year: 2007

Abstract

This paper describes an approach for a system that analyses the gestures of an orchestra conductor in real time, with the purpose of using the extracted tempo and expressive information for automatic playback on a computer-controlled instrument (synthesizer). In its final stage, the system will use non-intrusive computer vision methods to track the hands of the conductor. The main challenge is to interpret the motion of the hand/baton/mouse as beats on the timeline. The current implementation uses mouse motion to simulate the movement of the baton, allowing the user to "conduct" a pre-stored MIDI file of a classical orchestral work on a PC.
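The core step described above — interpreting pointer motion as conducting beats and deriving a tempo from them — can be sketched as follows. This is a minimal illustration under assumed conventions (a beat at the upward reversal after each downstroke, screen y increasing downward, a simple refractory interval), not the paper's actual implementation; all names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class BeatDetector:
    """Detects conducting beats from pointer samples by finding vertical
    direction reversals (the bottom of each baton downstroke)."""
    min_interval: float = 0.2          # s; reject reversals faster than ~300 BPM
    beats: list = field(default_factory=list)
    _prev_y: float = None
    _going_down: bool = False
    _last_beat_t: float = float("-inf")

    def update(self, t, y):
        """Feed one (time, y) pointer sample; return True when a beat fires."""
        beat = False
        if self._prev_y is not None:
            if y > self._prev_y:                    # screen y grows downward
                self._going_down = True
            elif y < self._prev_y and self._going_down:
                # Upward reversal after a downstroke -> candidate beat.
                if t - self._last_beat_t >= self.min_interval:
                    self.beats.append(t)
                    self._last_beat_t = t
                    beat = True
                self._going_down = False
        self._prev_y = y
        return beat

    def tempo_bpm(self):
        """Tempo estimated from the mean inter-beat interval."""
        if len(self.beats) < 2:
            return None
        ibis = [b - a for a, b in zip(self.beats, self.beats[1:])]
        return 60.0 / (sum(ibis) / len(ibis))
```

In a full system, the estimated tempo would drive the playback clock of the pre-stored MIDI file; here a simple down-up stroke every 0.5 s yields an estimate of 120 BPM.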