Real-time adaptive control of modal synthesis

  • Authors: Reynald Hoskinson, Kees van den Doel, Sidney Fels
  • Affiliation: University of British Columbia, Vancouver, Canada
  • Venue: NIME '03: Proceedings of the 2003 Conference on New Interfaces for Musical Expression
  • Year: 2003


Abstract

We describe the design and implementation of an adaptive system that maps control parameters to modal audio synthesis parameters in real time. The modal parameters describe the linear response of a virtual vibrating solid, which is played as a musical instrument through a separate interface. The system uses a three-layer feedforward neural network trained by backpropagation on a discrete set of input-output examples. After training, the network extends the training set, which serves as a specification-by-example of the controller, to a continuous mapping that allows real-time morphing of synthetic sound models.

We have implemented a prototype application using a controller that collects data from a hand-drawn digital picture. The virtual instrument consists of a bank of modal resonators whose frequencies, dampings, and gains are the parameters we control. We train the system on pictorial representations of physical objects such as a bell or a lamp, associating with each input a high-quality modal model obtained from measurements on the real object. After training, the user can draw pictures interactively and "play" modal models that provide interesting (though unrealistic) real-time interpolations of the models in the training set.
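To make the mapping scheme concrete, here is a minimal NumPy sketch of a three-layer (input-hidden-output) feedforward network trained by plain backpropagation on a handful of input-output pairs, which then generalizes to a continuous mapping. The paper does not specify the hidden-layer size, learning rate, controller features, or modal targets used, so all of those values below are made-up assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mapper(X, Y, hidden=16, lr=0.05, epochs=5000):
    """Train a 3-layer feedforward network with backpropagation to map
    controller features X (n_samples x n_in) to normalized modal
    parameters Y (n_samples x n_out). Returns a prediction function."""
    n_in, n_out = X.shape[1], Y.shape[1]
    W1 = rng.normal(0.0, 0.5, (n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, n_out)); b2 = np.zeros(n_out)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)           # hidden-layer activations
        pred = h @ W2 + b2                 # linear output layer
        err = pred - Y                     # gradient of squared error
        gW2 = h.T @ err; gb2 = err.sum(0)
        dh = (err @ W2.T) * (1.0 - h**2)   # backprop through tanh
        gW1 = X.T @ dh; gb1 = dh.sum(0)
        W1 -= lr * gW1; b1 -= lr * gb1     # gradient-descent updates
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda x: np.tanh(x @ W1 + b1) @ W2 + b2

# Toy training set: 2-D drawing features -> 3 modal frequencies (Hz).
# The feature vectors and frequencies are hypothetical placeholders.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
Y = np.array([[523.0, 1414.0, 2310.0],   # "bell"-like model
              [180.0,  410.0,  950.0],   # "lamp"-like model
              [330.0,  660.0,  990.0]])
predict = train_mapper(X, Y / 2500.0)     # normalize targets for training
print(predict(np.array([0.5, 0.5])) * 2500.0)  # morphed in-between model
```

Querying the trained network at an input that was not in the training set yields an in-between parameter set, which is the continuous morphing behavior the abstract describes.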
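The virtual instrument itself, a bank of modal resonators, can be sketched as a sum of exponentially damped sinusoids, which is the impulse response of the linear model the abstract describes. The mode data below (frequencies in Hz, dampings in 1/s, linear gains) are invented placeholders, not the measured models used in the paper, and a real-time implementation would use recursive resonator filters rather than this offline rendering:

```python
import numpy as np

def modal_impulse_response(freqs, dampings, gains, sr=44100, dur=2.0):
    """Render the impulse response of a modal resonator bank: each mode
    contributes g * exp(-d * t) * sin(2*pi*f*t) to the output signal."""
    t = np.arange(int(sr * dur)) / sr
    out = np.zeros_like(t)
    for f, d, g in zip(freqs, dampings, gains):
        out += g * np.exp(-d * t) * np.sin(2.0 * np.pi * f * t)
    return out

# Hypothetical bell-like model with three modes.
bell = modal_impulse_response(
    freqs=[523.0, 1414.0, 2310.0],
    dampings=[1.5, 3.0, 5.5],
    gains=[1.0, 0.6, 0.3],
)
```

Feeding the network's interpolated frequencies, dampings, and gains into such a bank is what lets a newly drawn picture be "played" as a sound model in between those of the training set.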