conga: a framework for adaptive conducting gesture analysis

  • Authors:
  • Eric Lee, Ingo Grüll, Henning Kiel, Jan Borchers

  • Affiliation:
  • RWTH Aachen University, Aachen, Germany

  • Venue:
  • NIME '06: Proceedings of the 2006 Conference on New Interfaces for Musical Expression
  • Year:
  • 2006

Abstract
Designing a conducting gesture analysis system for public spaces poses unique challenges. We present conga, a software framework that enables automatic recognition and interpretation of conducting gestures. conga recognizes multiple types of gestures with varying levels of difficulty for the user to perform, from a standard four-beat pattern, to simplified up-down conducting movements, to no pattern at all. conga provides an extensible library of feature detectors linked together into a directed acyclic graph; these graphs represent the various conducting patterns as gesture profiles. At run-time, conga searches for the profile that best matches a user's gestures in real time, and uses a beat prediction algorithm to provide results at the sub-beat level, in addition to output values such as tempo, gesture size, and the gesture's geometric center. Unlike some previous approaches, conga does not need to be trained with sample data before use. Our preliminary user tests show that conga has a beat recognition rate of over 90%. conga is deployed as the gesture recognition system for Maestro!, an interactive conducting exhibit that opened at the Betty Brinn Children's Museum in Milwaukee, USA in March 2006.
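The detector-graph architecture described above can be illustrated with a minimal sketch. All names here (`Detector`, `GestureProfile`, the toy up-down "beat" rule) are our own illustrative assumptions, not the paper's API: the idea is only that feature detectors form a directed acyclic graph, evaluated so each detector sees its upstream outputs, and that one such graph constitutes a gesture profile.

```python
# Hypothetical sketch of a feature-detector DAG ("gesture profile");
# names and logic are illustrative assumptions, not conga's actual API.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Detector:
    name: str
    inputs: List[str]             # names of upstream detectors or raw inputs
    fn: Callable[..., float]      # combines upstream outputs into a feature


class GestureProfile:
    """A DAG of feature detectors, evaluated in dependency order."""

    def __init__(self, detectors: List[Detector]):
        self.detectors = {d.name: d for d in detectors}

    def evaluate(self, sample: Dict[str, float]) -> Dict[str, float]:
        values: Dict[str, float] = dict(sample)   # raw inputs seed the graph
        remaining = list(self.detectors.values())
        while remaining:
            # A detector is ready once all of its inputs have been computed.
            ready = [d for d in remaining if all(i in values for i in d.inputs)]
            for d in ready:
                values[d.name] = d.fn(*(values[i] for i in d.inputs))
                remaining.remove(d)
        return values


# Toy profile: a "beat" detector fires when vertical velocity changes
# sign -- a crude downbeat cue for a simplified up-down pattern.
updown = GestureProfile([
    Detector("vy_sign", ["vy"], lambda vy: 1.0 if vy > 0 else -1.0),
    Detector("beat", ["vy_sign", "prev_sign"],
             lambda s, p: 1.0 if s != p else 0.0),
])

out = updown.evaluate({"vy": -0.3, "prev_sign": 1.0})
print(out["beat"])  # 1.0: velocity sign flipped, so a beat is reported
```

In a system like the one the abstract describes, several such profiles (four-beat, up-down, free-form) would be evaluated concurrently on the incoming gesture stream, with the best-matching profile selected at run time.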