Designing virtual instruments with touch-enabled interface

  • Authors:
  • Zhimin Ren; Ravish Mehra; Jason Coposky; Ming Lin

  • Affiliations:
  • University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA; University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA; Renaissance Computing Institute, Chapel Hill, North Carolina, USA; University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA

  • Venue:
  • CHI '12 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2012

Abstract

We present and discuss the design of a virtual musical instrument system that a group of users can play collaboratively to emulate percussive music. An optical multi-touch tabletop serves as the input device for multiple users, and an algorithmic pipeline interprets the users' interactions with this touch-sensing table and provides control signals that drive the coupled physics-based sound simulation system. The musical tones can be modulated by our numerical acoustic simulator to create believable acoustic effects produced by the cavities of instruments such as drums. The system further allows users to change the materials, shapes, and sizes of the instruments, thereby offering the capability for both rapid prototyping and active exploration of sound effects by altering various physical parameters. We discuss some of the key design principles and what such a system can offer.
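
To make the touch-to-sound pipeline described in the abstract concrete, the following is a minimal sketch in Python of one way such a chain could be organized: raw touch events are mapped to control signals (which instrument was struck and how hard), which then excite a simple modal-synthesis model standing in for the paper's physics-based sound simulation. All names, parameter values, and the left/right mapping of the tabletop are illustrative assumptions, not the authors' actual algorithm or interface.

    import numpy as np

    SAMPLE_RATE = 44100

    class ModalInstrument:
        """Toy modal model: a struck surface is a small set of damped
        sinusoidal modes; frequencies and damping stand in for
        (hypothetical) material and size parameters."""

        def __init__(self, base_freq=180.0, n_modes=6, damping=4.0, size=1.0):
            # Illustrative modal series; a real system would derive the
            # modes from a physics-based simulation of the geometry/material.
            self.freqs = base_freq / size * np.arange(1, n_modes + 1) * 1.59
            self.damps = damping * np.arange(1, n_modes + 1)

        def strike(self, gain, duration=1.0):
            """Return an audio buffer for a single impact of strength `gain`."""
            t = np.arange(int(duration * SAMPLE_RATE)) / SAMPLE_RATE
            modes = np.exp(-self.damps[:, None] * t) * np.sin(
                2 * np.pi * self.freqs[:, None] * t)
            return gain * modes.sum(axis=0) / len(self.freqs)

    def touch_to_control(touch):
        """Map a raw touch event (normalized position plus contact area)
        to a control signal: which instrument was hit and how hard."""
        instrument_id = 0 if touch["x"] < 0.5 else 1   # left/right split of the table
        gain = min(1.0, touch["area"] * 10.0)          # larger blob -> harder hit
        return instrument_id, gain

    # Two example instruments with different "material/size" settings.
    instruments = [ModalInstrument(base_freq=150.0, size=1.2),
                   ModalInstrument(base_freq=260.0, damping=8.0, size=0.8)]

    # A few simulated multi-touch events from the tabletop.
    events = [{"x": 0.2, "y": 0.4, "area": 0.05},
              {"x": 0.7, "y": 0.6, "area": 0.12}]

    for ev in events:
        idx, gain = touch_to_control(ev)
        audio = instruments[idx].strike(gain)
        print(f"instrument {idx}: gain={gain:.2f}, peak={audio.max():.3f}")

Changing the constructor arguments (base_freq, damping, size) is the analogue of the abstract's "altering various physical parameters" to explore different sound effects; the cavity-acoustics modulation described in the paper is not modeled here.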