International Journal of Human-Computer Studies - Application of affective computing in human-computer interaction
pDM: An Expressive Sequencer with Real-Time Control of the KTH Music-Performance Rules
Computer Music Journal
Body music: physical exploration of music theory
Sandbox '08 Proceedings of the 2008 ACM SIGGRAPH symposium on Video games
Evaluation of User's Physical Experience in Full Body Interactive Games
HAID '09 Proceedings of the 4th International Conference on Haptic and Audio Interaction Design
Mood swings: design and evaluation of affective interactive art
The New Review of Hypermedia and Multimedia - Special issue on experience design - applications and reflections
The SEMAINE API: towards a standards-based framework for building emotion-oriented systems
Advances in Human-Computer Interaction - Special issue on emotion-aware natural interaction
Showing emotions through movement and symmetry
Computers in Human Behavior
Eye.Breathe.Music: creating music through minimal movement
EVA'10 Proceedings of the 2010 international conference on Electronic Visualisation and the Arts
"The Approval of the Franciscan Rule": virtual experience among the characters of Giotto's work
VAST'10 Proceedings of the 11th International conference on Virtual Reality, Archaeology and Cultural Heritage
International Journal of Human-Computer Studies
In this paper we describe a system that allows users to control, with their full body and in real time, the generation of expressive audio-visual feedback. The system extracts expressive motion features from the user's full-body movements and gestures. The values of these motion features are mapped both onto acoustic parameters for the real-time expressive rendering of a piece of music, and onto real-time generated visual feedback projected on a screen in front of the user.
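The feature-to-parameter mapping described in the abstract could be sketched, purely illustratively, as a set of linear scalings from normalized motion features onto acoustic rendering parameters. The feature names (quantity of motion, contraction index, smoothness) and the parameter ranges below are assumptions for the sake of the example, not taken from the paper:

```python
# Illustrative sketch only: names and ranges are assumptions, not the authors' code.

def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamped."""
    value = max(in_min, min(in_max, value))
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def map_motion_to_sound(features):
    """Map normalized motion features (0..1) onto hypothetical acoustic parameters."""
    return {
        # more overall movement -> faster tempo (BPM)
        "tempo_bpm": scale(features["quantity_of_motion"], 0.0, 1.0, 60.0, 140.0),
        # more contracted posture -> quieter dynamics (dB offset)
        "loudness_db": scale(features["contraction_index"], 0.0, 1.0, 0.0, -12.0),
        # smoother gestures -> more legato articulation (0 = staccato, 1 = legato)
        "articulation": scale(features["smoothness"], 0.0, 1.0, 0.2, 1.0),
    }

params = map_motion_to_sound({
    "quantity_of_motion": 0.5,
    "contraction_index": 0.25,
    "smoothness": 1.0,
})
print(params)  # {'tempo_bpm': 100.0, 'loudness_db': -3.0, 'articulation': 1.0}
```

In a real system of this kind the mappings would typically be nonlinear and tuned perceptually (e.g., via the KTH performance rules cited above); the linear version here only shows the overall data flow from motion analysis to sound control.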