The Singing Tree: design of an interactive musical interface
DIS '97 Proceedings of the 2nd conference on Designing interactive systems: processes, practices, methods, and techniques
VRPN: a device-independent, network-transparent VR peripheral system
VRST '01 Proceedings of the ACM symposium on Virtual reality software and technology
2 performances in the 21st century virtual color organ
C&C '02 Proceedings of the 4th conference on Creativity & cognition
In-situ speech visualization in real-time interactive installation and performance
Proceedings of the 3rd international symposium on Non-photorealistic animation and rendering
The ANIMUS project: a framework for the creation of interactive creatures in immersed environments
Proceedings of the ACM symposium on Virtual reality software and technology
Using music to interact with a virtual character
NIME '05 Proceedings of the 2005 conference on New interfaces for musical expression
Brain Indices of Music Processing: "Nonmusicians" are Musical
Journal of Cognitive Neuroscience
Processing Syntactic Relations in Language and Music: An Event-Related Potential Study
Journal of Cognitive Neuroscience
A perception and selective attention system for synthetic creatures
SG'03 Proceedings of the 3rd international conference on Smart graphics
Virtual rap dancer: invitation to dance
CHI '06 Extended Abstracts on Human Factors in Computing Systems
Towards Affective-Psychophysiological Foundations for Music Production
ACII '07 Proceedings of the 2nd international conference on Affective Computing and Intelligent Interaction
dream.Medusa: A Participatory Performance
SG '08 Proceedings of the 9th international symposium on Smart Graphics
Interacting with a virtual rap dancer
INTETAIN'05 Proceedings of the First international conference on Intelligent Technologies for Interactive Entertainment
We describe an immersive music visualization application that enables interaction between a live musician and a responsive virtual character. The character reacts to the live performance in such a way that it appears to experience an emotional response to the music it 'hears'. We modify an existing tonal music encoding strategy to define how the character perceives and organizes musical information. Drawing on existing research correlating musical structures with composers' emotional intentions, we simulate cognitive processes capable of inferring emotional meaning from music. The ANIMUS framework is used to define a synthetic character that visualizes its perception and cognition of musical input by exhibiting responsive behaviour expressed through animation.
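The abstract's idea of inferring emotional meaning from musical structure can be illustrated with a toy sketch. This is not the paper's actual method; it is a minimal heuristic, assuming a valence/arousal emotion model and hypothetical feature names (`tempo_bpm`, `is_major`, `loudness`), in the spirit of feature-emotion correlations reported in the music-cognition literature.

```python
from dataclasses import dataclass

@dataclass
class MusicalFeatures:
    tempo_bpm: float  # tempo of the live performance, in beats per minute
    is_major: bool    # major vs. minor mode
    loudness: float   # normalized dynamic level, 0.0-1.0

def infer_emotion(f: MusicalFeatures) -> dict:
    """Toy mapping from surface musical features to a valence/arousal
    estimate: faster, louder music raises arousal; major mode raises
    valence. All weights here are illustrative, not from the paper."""
    arousal = min(1.0, (f.tempo_bpm / 200.0) * 0.7 + f.loudness * 0.3)
    valence = 0.7 if f.is_major else 0.3
    return {"valence": valence, "arousal": round(arousal, 2)}

# Example: a moderately loud, up-tempo passage in a major key.
print(infer_emotion(MusicalFeatures(tempo_bpm=140, is_major=True, loudness=0.6)))
```

A character driven by such an estimate could then select animation behaviours keyed to regions of the valence/arousal plane (e.g. high-arousal/high-valence triggering energetic movement).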