Modular musical objects towards embodied control of digital music
Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction
A multimodal probabilistic model for gesture-based control of sound synthesis
Proceedings of the 21st ACM international conference on Multimedia
We address the issue of mapping between gesture and sound for gesture-based control of physical modeling sound synthesis. We propose an approach called mapping by demonstration, which allows users to design the mapping by performing gestures while listening to sound examples. The system is based on a multimodal model that learns the relationships between gestures and sounds.
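The core idea of such a multimodal model can be illustrated with a minimal sketch: fit a joint distribution over paired gesture and sound-parameter observations recorded during demonstration, then condition on a new gesture to regress the sound parameters. This is only an illustrative toy (a single joint Gaussian with conditional-mean regression, synthetic data, and all variable names invented here), not the probabilistic model used in the paper.

```python
import numpy as np

# Hypothetical demonstration data: each row pairs a gesture feature
# vector (e.g. accelerometer features) with the sound-synthesis
# parameters heard at the same instant.
rng = np.random.default_rng(0)
gestures = rng.normal(size=(500, 3))
true_map = np.array([[0.5, 0.1],
                     [-0.2, 0.4],
                     [0.3, 0.2]])
sounds = gestures @ true_map + 0.05 * rng.normal(size=(500, 2))

# "Demonstration" phase: fit one joint Gaussian over [gesture, sound].
joint = np.hstack([gestures, sounds])
mu = joint.mean(axis=0)
cov = np.cov(joint, rowvar=False)

dg = gestures.shape[1]
mu_g, mu_s = mu[:dg], mu[dg:]
cov_gg = cov[:dg, :dg]
cov_sg = cov[dg:, :dg]

def predict_sound(gesture):
    """Performance phase: conditional mean of the sound parameters
    given a new gesture, derived from the learned joint Gaussian."""
    return mu_s + cov_sg @ np.linalg.solve(cov_gg, gesture - mu_g)

print(predict_sound(np.array([0.1, -0.3, 0.2])))
```

In the demonstration phase the joint statistics play the role of the learned gesture-sound relationship; at performance time, conditioning the model on the incoming gesture stream yields the synthesis parameters to drive the sound engine.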