This article reports on the exploration of a method based on canonical correlation analysis (CCA) for analysing the relationship between gesture and sound in the context of music performance and listening. The method is a first step towards an analysis tool for gesture-sound relationships. In this exploration, we used motion capture data recorded from subjects performing free hand movements while listening to short sound examples. We assume that even though the relationship between gesture and sound is likely more complex than a linear mapping, at least part of it can be revealed and quantified by linear multivariate regression applied to the motion capture data and to audio descriptors extracted from the sound examples. After outlining the theoretical background, the article shows how the method supports pertinent reasoning about the relationship between gesture and sound by analysing data sets recorded from both individual subjects and multiple subjects.
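As a rough illustration of the technique named in the abstract (not the authors' implementation), canonical correlation analysis can be computed with plain NumPy: the singular value decomposition of the whitened cross-covariance between a motion-feature matrix and an audio-descriptor matrix yields the canonical projection bases and correlations. The synthetic "motion" and "audio" matrices below are stand-ins for real motion capture features and audio descriptors, constructed so that a shared latent component links the two.

```python
import numpy as np

def cca(X, Y, n_components=1, reg=1e-8):
    """Canonical correlation analysis via SVD of the whitened
    cross-covariance. Returns projection bases A (for X), B (for Y)
    and the canonical correlations."""
    X = X - X.mean(axis=0)                  # center both data sets
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Slightly regularised covariance estimates for numerical stability
    Sxx = X.T @ X / (n - 1) + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / (n - 1) + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / (n - 1)

    def inv_sqrt(S):                        # inverse square root of a symmetric PD matrix
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    U, s, Vt = np.linalg.svd(K)
    A = inv_sqrt(Sxx) @ U[:, :n_components]     # canonical basis for X
    B = inv_sqrt(Syy) @ Vt[:n_components].T     # canonical basis for Y
    return A, B, s[:n_components]

# Synthetic stand-in data: a shared latent component drives both the
# "motion features" (3-D) and the "audio descriptors" (2-D).
rng = np.random.default_rng(0)
z = rng.standard_normal((500, 1))                # shared latent signal
motion = z @ np.array([[1.0, 0.5, -0.3]]) + 0.1 * rng.standard_normal((500, 3))
audio = z @ np.array([[0.8, -0.6]]) + 0.1 * rng.standard_normal((500, 2))

A, B, corrs = cca(motion, audio)
print(f"first canonical correlation: {corrs[0]:.3f}")
```

Projecting the centred data onto `A` and `B` gives the maximally correlated pair of one-dimensional signals; the strength of that correlation is one way to quantify how much of the gesture-sound relationship a linear model captures.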