The Musical Interface Technology Design Space

  • Authors:
  • Dan Overholt

  • Affiliations:
  • Aalborg University, Department of Media Technology, Section for Medialogy, Niels Jernes Vej 14, DK-9220 Aalborg, Denmark

  • Venue:
  • Organised Sound
  • Year:
  • 2009

Abstract

This article presents a theoretical framework for the design of expressive musical instruments, the Musical Interface Technology Design Space (MITDS). The activities of imagining, designing and building new musical instruments; performing, composing and improvising with them; and analysing the whole process in order to better understand the interface, our physical and cognitive associations with it, and the relationship between performer, instrument and audience can only be seen as an ongoing, iterative work in progress. It is long-term evolutionary research, as each generation of a new musical instrument requires inventiveness and years of dedication to the practice and mastery of its performance system (comprising the interface, the synthesis and the mappings between them). Many revisions of the system may be required in order to develop musical interface technologies that enable truly expressive performances. The MITDS provides a conceptual framework for describing, analysing, designing and extending the interfaces, mappings, synthesis algorithms and performance techniques of interactive musical instruments. It gives designers a theoretical base to draw upon when creating technologically advanced performance systems, and can be seen both as a set of guidelines for analysis and as a taxonomy of design patterns for interactivity in musical instruments. The MITDS focuses mainly on human-centred design approaches to real-time control of the multidimensional parameter spaces encountered in musical composition and performance, where the primary objective is to close the gap between human gestures and complex synthesis methods.
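
The layering the abstract refers to, an interface producing gestural data, a mapping layer, and a synthesis engine with a multidimensional parameter space, can be pictured with a small sketch. The following Python snippet is not taken from the article; all class and parameter names (GestureInterface, Synthesizer, the gesture dimensions pressure, tilt and position, and the synthesis parameters) are hypothetical stand-ins, shown only to illustrate how a mapping layer can translate a few gesture dimensions into several synthesis parameters.

```python
import math


class GestureInterface:
    """Stands in for a hardware interface exposing a few gesture dimensions."""

    def read(self) -> dict[str, float]:
        # In a real instrument these would be live sensor readings (0.0-1.0).
        return {"pressure": 0.7, "tilt": 0.3, "position": 0.5}


class Synthesizer:
    """Stands in for a synthesis engine with a multidimensional parameter space."""

    def __init__(self) -> None:
        self.params = {"frequency": 440.0, "amplitude": 0.0, "brightness": 0.0}

    def set_params(self, **kwargs: float) -> None:
        self.params.update(kwargs)


def mapping(gesture: dict[str, float], synth: Synthesizer) -> None:
    """One possible many-to-many mapping: each synthesis parameter is driven
    by a combination of gesture dimensions rather than by a single sensor."""
    synth.set_params(
        frequency=220.0 * math.pow(2.0, 2.0 * gesture["position"]),  # two-octave range
        amplitude=gesture["pressure"],
        brightness=0.5 * gesture["pressure"] + 0.5 * gesture["tilt"],
    )


interface = GestureInterface()
synth = Synthesizer()
mapping(interface.read(), synth)
print(synth.params)
```

In this sketch the mapping function is the only place where interface and synthesis meet, which is the design point the MITDS treats as central: the expressiveness of the instrument depends on how gesture dimensions are combined and routed, not only on the sensors or the synthesis algorithm in isolation.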