Conducting Digitally Stored Music by Computer Vision Tracking

  • Authors:
  • Reinhold Behringer

  • Affiliations:
  • Leeds Metropolitan University

  • Venue:
  • AXMEDIS '05 Proceedings of the First International Conference on Automated Production of Cross Media Content for Multi-Channel Distribution
  • Year:
  • 2005

Abstract

In this paper, a method for controlling electronic digital music instruments is proposed, based on visual capture of a conductor's baton and hand motion. The approach is suited to mixed ensembles of human musicians and electronic instruments. Well-established computer vision methods are used to track the motion of the baton and to deduce either musical parameters for sound creation (volume, pitch, expression) or cues for the time-synchronized replay of previously recorded music notation sequences (beat, tempo, expression). Combined with acoustic signal processing, this method can enable the automatic playing of a computer-based instrument in an orchestra, in which the conductor conducts both this instrument and the human musicians, allowing intuitive control of timing and expression towards a unique interpretation. The paper introduces the concept and discusses its feasibility.
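To make the cue-extraction idea concrete, the following is a minimal sketch (not the paper's implementation) of how beat and tempo cues might be derived from a tracked baton trajectory. It assumes the vision tracker delivers `(timestamp, y)` samples of the baton tip, and that a beat is signalled at each local minimum of the vertical trajectory, i.e. the bottom turning point of the downstroke; the names `detect_beats` and `tempo_bpm` are illustrative, not from the paper.

```python
import math

def detect_beats(samples):
    """Return timestamps of local minima in the baton's y-trajectory.

    samples: list of (timestamp, y) pairs from a hypothetical vision tracker.
    A local minimum of y is taken as the bottom of a conducting downstroke.
    """
    beats = []
    for i in range(1, len(samples) - 1):
        t, y = samples[i]
        if y < samples[i - 1][1] and y <= samples[i + 1][1]:
            beats.append(t)
    return beats

def tempo_bpm(beat_times):
    """Estimate tempo in beats per minute from the mean inter-beat interval."""
    if len(beat_times) < 2:
        return None
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic trajectory: a sinusoidal up-down motion bottoming out every
# 0.5 s, i.e. a conductor beating at 120 BPM, sampled at 20 Hz.
samples = [(i * 0.05, math.cos(2 * math.pi * (i * 0.05) / 0.5))
           for i in range(41)]
beats = detect_beats(samples)
print(round(tempo_bpm(beats), 1))  # → 120.0
```

In a real system the trajectory would be noisy, so the turning-point test would need smoothing and hysteresis, and the tempo estimate would feed a scheduler that drives the synthesized instrument in time with the human players.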