Control parameters for musical instruments: a foundation for new mappings of gesture to sound

  • Authors:
  • Daniel J. Levitin; Stephen McAdams; Robert L. Adams

  • Affiliations:
  • Departments of Psychology and Music Theory, and Centre for Interdisciplinary Research in Music, Media and Technology (CIRMMT), McGill University, Montreal, Canada. E-mail: levitin@psych.mcgill.ca; Institut de Recherche et Coordination Acoustique/Musique (IRCAM-CNRS), Paris, France; University of California at Davis, USA

  • Venue:
  • Organised Sound
  • Year:
  • 2002

Abstract

In this paper we describe a new way of thinking about musical tones, specifically in the context of how features of a sound might be controlled by computer musicians, and how those features might be most appropriately mapped onto musical controllers. Our approach is the consequence of one bias that we should reveal at the outset: we believe that electronically controlled (and this includes computer-controlled) musical instruments need to be emancipated from the keyboard metaphor; although piano-like keyboards are convenient and familiar, they limit the musician's expressiveness (Mathews 1991, Vertegaal and Eaglestone 1996, Paradiso 1997, Levitin and Adams 1998). This is especially true in the domain of computer music, in which timbres can be created that go far beyond the physical constraints of traditional acoustic instruments.