Gestural hyper instrument collaboration with generative computation for real time creativity

  • Authors:
  • Kirsty Beilharz; Sam Ferguson

  • Affiliations:
  • University of Sydney, Sydney, UNK, Australia (both authors)

  • Venue:
  • Proceedings of the 6th ACM SIGCHI conference on Creativity & cognition
  • Year:
  • 2007


Abstract

This paper describes the performance, mapping, transformation and representation phases of a model for gesture-triggered musical creativity. These phases are articulated in an example creative environment, Hyper-Shaku (Border-Crossing), an audio-visually augmented shakuhachi performance that demonstrates the adaptive, empathetic response of the generative systems. The shakuhachi is a traditional Japanese end-blown bamboo Zen flute. Its five holes and simple construction require subtle and complex gestural movements to produce its diverse range of pitches, vibrato and pitch inflections, making it an ideal candidate for gesture capture. The environment uses computer vision, gesture sensors and computer listening to process and generate electronic music and visualization in real-time response to the live performer. In this example, looming auditory motion and Neural Oscillator Network (NOSC) generative modules are integrated.
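The mapping phase described in the abstract — turning raw gesture-sensor and computer-listening features into parameters for the generative electronic part — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the feature names (`tilt_deg`, `breath_rms`), sensor ranges, and output parameters are all assumptions.

```python
# Hypothetical sketch of a gesture-to-parameter mapping stage.
# All names, ranges, and parameter choices are illustrative assumptions,
# not taken from the Hyper-Shaku system itself.

def normalize(value, lo, hi):
    """Clamp and scale a raw sensor reading into [0, 1]."""
    if hi <= lo:
        raise ValueError("invalid sensor range")
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def map_gesture(tilt_deg, breath_rms):
    """Map two assumed gesture features to generative-module parameters.

    tilt_deg   -- e.g. instrument tilt from a motion sensor, in degrees
    breath_rms -- e.g. an audio envelope-follower value from computer listening
    """
    tilt = normalize(tilt_deg, -45.0, 45.0)
    breath = normalize(breath_rms, 0.0, 1.0)
    return {
        "pitch_bend": (tilt - 0.5) * 2.0,  # -1 .. +1 (arbitrary units)
        "density": breath,                 # generative event density, 0 .. 1
    }

params = map_gesture(tilt_deg=0.0, breath_rms=0.5)
```

In a real-time setting such a function would run once per sensor frame, feeding the resulting parameters to the generative and visualization modules.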