Emotional remapping of music to facial animation

  • Authors:
  • Steve DiPaola; Ali Arya

  • Affiliations:
  • Simon Fraser University; Carleton University

  • Venue:
  • Proceedings of the 2006 ACM SIGGRAPH symposium on Videogames
  • Year:
  • 2006


Abstract

We propose a method to extract the emotional data from a piece of music and then use that data via a remapping algorithm to automatically animate an emotional 3D face sequence. The method is based on studies of the emotional aspect of music and our parametric-based behavioral head model for face animation. We address the issue of affective communication remapping in general, i.e. translation of affective content (eg. emotions, and mood) from one communication form to another. We report on the results of our MusicFace system, which use these techniques to automatically create emotional facial animations from multi-instrument polyphonic music scores in MIDI format and a remapping rule set.