Using a myokinetic synthesizer to control virtual instruments

  • Authors:
  • Duk Shin;Atsushi Katayama;Kyoungsik Kim;Hiroyuki Kambara;Makoto Sato;Yasuharu Koike

  • Affiliations:
  • Precision and Intelligence Laboratory, Tokyo Institute of Technology;Department of Intelligence and Systems Science, Tokyo Institute of Technology;Department of Intelligence and Systems Science, Tokyo Institute of Technology;Department of Intelligence and Systems Science, Tokyo Institute of Technology;Precision and Intelligence Laboratory, Tokyo Institute of Technology;Precision and Intelligence Laboratory, Tokyo Institute of Technology

  • Venue:
  • ICAT'06 Proceedings of the 16th international conference on Advances in Artificial Reality and Tele-Existence
  • Year:
  • 2006


Abstract

We have been developing a new type of human-computer interface, the myokinetic synthesizer (MyoKinSynthesizer), which uses electromyography (EMG) signals. It enables a user to select virtual instruments and to control their properties, such as volume and tone, without any position or force sensors. The virtual marimba system emulates the basic properties of the real instrument by producing a sound that depends on which of eight zones the user hits and how hard. The virtual drum set is composed of four different virtual drums controlled by the arms and legs. We used a three-layer neural network to estimate the position and force of the forearm from EMG signals. After training the neural network and obtaining appropriate weights, the subject was able to control the movement of a virtual avatar and to play the virtual instruments. The system is intended to serve as a demonstration of VR entertainment and as a tool for music therapy and rehabilitation.
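The pipeline the abstract describes — EMG features fed through a three-layer neural network to estimate forearm position and force, which then select one of eight marimba zones and set the volume — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the channel count, layer sizes, smoothing constant, zone geometry, and the randomly initialized weights (standing in for trained ones) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_EMG = 8        # number of EMG channels (assumed)
N_HIDDEN = 16    # hidden units (assumed)
N_OUT = 3        # e.g. 2-D forearm position + 1-D force (assumed)

# Random weights stand in for the values obtained by training.
W1 = rng.normal(scale=0.1, size=(N_HIDDEN, N_EMG))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.1, size=(N_OUT, N_HIDDEN))
b2 = np.zeros(N_OUT)

def preprocess(raw_emg, prev, alpha=0.9):
    """Full-wave rectify and low-pass filter one EMG sample."""
    rectified = np.abs(raw_emg)
    return alpha * prev + (1.0 - alpha) * rectified

def estimate(emg_features):
    """Three-layer network forward pass: EMG features -> [x, y, force]."""
    hidden = np.tanh(W1 @ emg_features + b1)  # nonlinear hidden layer
    return W2 @ hidden + b2                   # linear output layer

def marimba_event(position_x, force, n_zones=8, x_range=(-0.4, 0.4)):
    """Map an estimated hit position to one of eight zones (assumed 0.8 m
    span) and clip the estimated force into a 0..1 volume."""
    lo, hi = x_range
    frac = np.clip((position_x - lo) / (hi - lo), 0.0, 1.0 - 1e-9)
    zone = int(frac * n_zones)
    volume = float(np.clip(force, 0.0, 1.0))
    return zone, volume

# One simulated time step of use.
features = preprocess(rng.normal(size=N_EMG), prev=np.zeros(N_EMG))
x, y, force = estimate(features)
zone, volume = marimba_event(x, force)
```

In a real-time system the preprocessing and forward pass would run per sample at the EMG sampling rate, with `marimba_event` triggering a sound only when the estimated force crosses a hit threshold.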