Gesture interaction for electronic music performance
HCI'07 Proceedings of the 12th international conference on Human-computer interaction: intelligent multimodal interaction environments
In this paper, a method for controlling electronic digital music instruments is proposed, based on visual capture of the baton and hand motion of a conductor. The approach is well suited to mixed ensembles of human musicians and electronic instruments. Well-established computer vision methods are used to track the motion of the baton and to deduce musical parameters (volume, pitch, expression) for sound creation, or cues (beat, tempo, expression) for the time-synchronized playback of previously recorded music notation sequences. Combined with acoustic signal processing, this method can enable the automatic playing of a computer-based instrument in an orchestra, in which the conductor conducts both this instrument and the human musicians. This allows intuitive control of timing and expression towards a unique interpretation. The paper introduces the concept and discusses its feasibility.
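To illustrate the kind of processing the abstract describes, the following is a minimal, hypothetical sketch (not the paper's actual implementation) of deriving beat and tempo cues from tracked baton motion. It assumes a vision front-end that yields timestamped vertical positions of the baton tip, and takes each local minimum of the vertical trajectory as a beat point (the downward "ictus" of a conducting gesture); all function names and the sampling setup are illustrative assumptions.

```python
import math

def detect_beats(samples):
    """samples: list of (t, y) baton-tip positions over time.
    Returns the timestamps of strict local minima of y, taken as beats."""
    beats = []
    for i in range(1, len(samples) - 1):
        _, y_prev = samples[i - 1]
        t, y = samples[i]
        _, y_next = samples[i + 1]
        if y < y_prev and y < y_next:  # strict local minimum of vertical position
            beats.append(t)
    return beats

def estimate_tempo_bpm(beat_times):
    """Convert the mean inter-beat interval (seconds) to beats per minute."""
    if len(beat_times) < 2:
        return None
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic baton trajectory: a 2 Hz up-down motion sampled at 20 Hz,
# standing in for the output of the visual tracking stage.
samples = [(i / 20.0, math.cos(4.0 * math.pi * (i / 20.0))) for i in range(41)]
beats = detect_beats(samples)          # minima at t = 0.25, 0.75, 1.25, 1.75 s
tempo = estimate_tempo_bpm(beats)      # 0.5 s between beats -> 120 BPM
```

In a real system of the kind the paper proposes, the detected beat times would be smoothed and used as synchronization cues for the playback of pre-recorded notation sequences, while amplitude and shape of the gesture could map to volume and expression.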