Sonification of facial actions for musical expression

  • Authors:
  • Mathias Funk; Kazuhiro Kuwabara; Michael J. Lyons

  • Affiliations:
  • ATR Intelligent Robotics & Communication Labs, Keihanna Science City, Kyoto, Japan (all authors)

  • Venue:
  • NIME '05: Proceedings of the 2005 Conference on New Interfaces for Musical Expression
  • Year:
  • 2005

Abstract

The central role of the face in social interaction and non-verbal communication suggests exploring facial action as a means of musical expression. This paper presents the design, implementation, and preliminary studies of a novel system that uses face detection and optical flow algorithms to associate facial movements with sound synthesis in a topographically specific fashion. We report on our experience with various gesture-to-sound mappings and applications, and describe our preliminary experiments in musical performance using the system.
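
The paper itself gives no code; the following is a minimal illustrative sketch, not the authors' implementation, of the general idea the abstract describes: detect the face, estimate motion in distinct facial regions (upper face vs. mouth area), and map the resulting motion magnitudes to synthesis parameters. It assumes OpenCV's Haar-cascade face detector and Farneback dense optical flow; the gesture-to-sound mapping (eyebrow motion to pitch modulation, mouth motion to amplitude) is purely hypothetical and simply printed rather than sent to a synthesizer.

```python
# Illustrative sketch only: region-wise facial motion via OpenCV
# (Haar-cascade face detection + Farneback dense optical flow),
# mapped to placeholder synthesis parameters.
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def region_motion(flow, x0, y0, x1, y1):
    """Mean optical-flow magnitude inside a rectangular sub-region."""
    region = flow[y0:y1, x0:x1]
    return float(np.mean(np.linalg.norm(region, axis=2)))

cap = cv2.VideoCapture(0)          # live camera input
prev_gray = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    if prev_gray is not None:
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            # Dense optical flow over the whole frame (could be restricted
            # to the face box for speed).
            flow = cv2.calcOpticalFlowFarneback(
                prev_gray, gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)

            # Topographically specific regions: upper face (eyebrows/eyes)
            # and lower face (mouth/jaw).
            brow = region_motion(flow, x, y, x + w, y + h // 2)
            mouth = region_motion(flow, x, y + h // 2, x + w, y + h)

            # Hypothetical gesture-to-sound mapping: eyebrow motion -> pitch
            # modulation depth, mouth motion -> amplitude. A real system
            # would send these values to a synthesizer (e.g. via OSC/MIDI).
            pitch_mod = min(brow / 10.0, 1.0)
            amplitude = min(mouth / 10.0, 1.0)
            print(f"pitch_mod={pitch_mod:.2f}  amplitude={amplitude:.2f}")

    prev_gray = gray
    cv2.imshow("face", frame)
    if cv2.waitKey(1) == 27:       # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The split into upper and lower face regions stands in for the "topographically specific" mapping mentioned in the abstract; the original system's actual region definitions and mappings are described in the paper itself.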