Real-time feature-based synthesis for live musical performance

  • Authors:
  • Matt Hoffman; Perry R. Cook

  • Affiliations:
  • Princeton University, Princeton, NJ; Princeton University, Princeton, NJ

  • Venue:
  • NIME '07 Proceedings of the 7th International Conference on New Interfaces for Musical Expression
  • Year:
  • 2007

Abstract

A crucial set of decisions in digital musical instrument design deals with choosing mappings between parameters controlled by the performer and the synthesis algorithms that actually generate sound. Feature-based synthesis offers a way to parameterize audio synthesis in terms of the quantifiable perceptual characteristics, or features, the performer wishes the sound to take on. Techniques for accomplishing such mappings and enabling feature-based synthesis to be performed in real time are discussed. An example is given of how a real-time performance system might be designed to take advantage of feature-based synthesis's ability to provide perceptually meaningful control over a large number of synthesis parameters.
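To make the idea of feature-based control concrete, here is a minimal sketch of one way such a mapping can work; this is an illustration, not the authors' system. It assumes a toy two-parameter additive synthesizer and two simple features (RMS level and zero-crossing rate, a crude brightness proxy), and maps target feature values back to synthesis parameters by nearest-neighbour lookup over a precomputed table. All names and the lookup strategy are hypothetical choices made for this example.

```python
import math

def synthesize(freq, brightness, n=512, sr=8000):
    """Toy additive synth: a fundamental plus one octave harmonic
    whose level is set by 'brightness' (hypothetical parameters)."""
    return [math.sin(2 * math.pi * freq * t / sr)
            + brightness * math.sin(2 * math.pi * 2 * freq * t / sr)
            for t in range(n)]

def features(signal):
    """Two toy perceptual features: RMS level and zero-crossing rate."""
    rms = math.sqrt(sum(x * x for x in signal) / len(signal))
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / len(signal)
    return (rms, zcr)

# Offline step: sample the synth's parameter space and record the
# features each parameter setting produces.
database = []
for freq in (220.0, 440.0, 880.0):
    for brightness in (0.0, 0.5, 1.0):
        database.append(((freq, brightness),
                         features(synthesize(freq, brightness))))

def params_for(target):
    """Runtime mapping: given target feature values, return the
    synthesis parameters whose features lie closest in feature space."""
    best = min(database,
               key=lambda entry: sum((f - t) ** 2
                                     for f, t in zip(entry[1], target)))
    return best[0]
```

A performer-facing control would then set `target` (e.g. "louder, brighter") and the lookup yields parameters for the synth; a real-time system would refine this with interpolation or iterative search rather than a coarse grid, but the table-lookup structure illustrates why the mapping can run cheaply once the offline analysis is done.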