Facing the music: a facial action controlled musical interface

  • Authors:
  • Michael J. Lyons; Nobuji Tetsutani

  • Affiliations:
  • ATR Media Integration and Communication Research Laboratories, Soraku-gun, Kyoto, Japan (both authors)

  • Venue:
  • CHI '01 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2001
  • Topics:
  • Haptics; Music, cognition, and computerized sound

Abstract

We describe a novel musical controller that acquires live video of the user's face, extracts facial feature parameters using a computer vision algorithm, and converts these parameters to expressive musical effects. The controller allows the user to modify synthesized or audio-filtered musical sound in real time by moving the face.
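
The abstract describes a three-stage pipeline: live video capture, facial feature extraction, and mapping of the extracted parameters to musical control. The sketch below illustrates such a loop; it is not the authors' system. OpenCV's stock Haar cascade stands in for the paper's vision algorithm, the darkness of the lower face region serves as a crude proxy for mouth aperture, and the mido library (an assumed choice, not mentioned in the paper) carries the control messages.

```python
# Minimal sketch of a face-to-MIDI control loop in the spirit of the paper.
# Assumptions (not from the paper): OpenCV's stock Haar cascade stands in
# for the vision algorithm, lower-face darkness is a rough proxy for mouth
# aperture, and mido delivers the MIDI control messages.
import cv2
import mido

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
midi_out = mido.open_output()      # default system MIDI output port
cap = cv2.VideoCapture(0)          # live video input from the default camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces[:1]:             # track the first detected face
        mouth_roi = gray[y + 2 * h // 3 : y + h, x : x + w]
        if mouth_roi.size:
            # Darker lower-face region ~ wider mouth opening (rough proxy).
            aperture = 255 - int(mouth_roi.mean())
            value = max(0, min(127, aperture))
            # Map the facial parameter to MIDI CC 74 (commonly filter cutoff).
            midi_out.send(mido.Message("control_change", control=74, value=value))
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("facial controller", frame)
    if cv2.waitKey(1) & 0xFF == 27:            # Esc quits the loop
        break

cap.release()
cv2.destroyAllWindows()
```

A natural refinement, closer to what the paper implies, would replace the darkness proxy with proper feature tracking (for example, mouth width and height, or eyebrow position) and map each facial parameter to its own continuous controller.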