Visualizing emotion in musical performance using a virtual character

  • Authors:
  • Robyn Taylor, Pierre Boulanger, Daniel Torres

  • Affiliations:
  • Advanced Man-Machine Interface Laboratory, Department of Computing Science, University of Alberta, Edmonton, Alberta, Canada (all authors)

  • Venue:
  • SG'05: Proceedings of the 5th International Conference on Smart Graphics
  • Year:
  • 2005

Abstract

We describe an immersive music visualization application that enables interaction between a live musician and a responsive virtual character. The character reacts to live performance in such a way that it appears to be experiencing an emotional response to the music it 'hears'. We modify an existing tonal music encoding strategy to define how the character perceives and organizes musical information, and we draw on existing research correlating musical structures with composers' emotional intentions in order to simulate cognitive processes capable of inferring emotional meaning from music. The ANIMUS framework is used to define a synthetic character that visualizes its perception and cognition of musical input by exhibiting responsive behaviour expressed through animation.
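The pipeline the abstract outlines (perceive musical features, infer an emotional state, express it through animated behaviour) can be illustrated with a minimal sketch. All names below (`MusicalPercept`, `infer_emotion`, `select_behaviour`) and the feature-to-emotion rules are hypothetical assumptions for illustration, not the paper's encoding strategy or the ANIMUS API; the mode/tempo heuristics only loosely echo the kind of structure-emotion correlations the authors reference.

```python
from dataclasses import dataclass

@dataclass
class MusicalPercept:
    """Features assumed to be extracted from live musical input."""
    mode: str        # 'major' or 'minor'
    tempo_bpm: float # beats per minute
    loudness: float  # normalized 0..1

def infer_emotion(p: MusicalPercept) -> str:
    """Toy cognition layer: map musical structure to an emotion label.

    The rules are placeholders standing in for research-derived
    correlations (e.g., fast/major tends toward 'joy', slow/minor
    toward 'sadness').
    """
    if p.mode == "major" and p.tempo_bpm > 120:
        return "joy"
    if p.mode == "minor" and p.tempo_bpm < 90:
        return "sadness"
    return "calm"

def select_behaviour(emotion: str) -> str:
    """Toy expression layer: choose a responsive animation to play."""
    return {
        "joy": "dance_energetically",
        "sadness": "slump_and_sway",
        "calm": "idle_breathing",
    }.get(emotion, "idle_breathing")

if __name__ == "__main__":
    # A slow minor-mode passage should read as 'sadness' in this sketch.
    percept = MusicalPercept(mode="minor", tempo_bpm=72, loudness=0.4)
    emotion = infer_emotion(percept)
    print(emotion, "->", select_behaviour(emotion))
```

In the paper's actual system, each of these stages is considerably richer: perception is grounded in a modified tonal music encoding, and expression is realized as continuous character animation rather than a discrete behaviour label.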