Gesture Control of Sound Spatialization for Live Musical Performance
Gesture-Based Human-Computer Interaction and Simulation
This paper presents a methodology and a set of tools for the gesture control of sound sources in 3D surround sound. Techniques for rendering acoustic events on multi-speaker or headphone-based surround systems have matured considerably, making them usable in real-time performance with lightweight equipment. The placement of sound sources, however, is usually controlled in idiosyncratic ways and has not yet been fully explored or formalized. This paper addresses that gap by proposing a methodical approach: gestures are mapped to source motion by giving each source the properties of a physical object and manipulating those properties with standard geometric transforms, applied through hierarchical or emergent relationships among sources.
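The mapping strategy described in the abstract can be sketched in code. The following is a minimal, illustrative example only, not the paper's actual implementation: the class names, the 2D geometry, and the group structure are all assumptions. It shows a sound source treated as a positioned object, and a hierarchical relationship in which a geometric transform applied to a group propagates to every member source.

```python
import math

class SoundSource:
    """A sound source modeled as a physical object with a 2D position
    (hypothetical sketch; the paper's sources may carry more state)."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def rotate_about(self, cx, cy, angle):
        # Standard geometric transform: rotate around a pivot,
        # e.g. the listener at the origin.
        s, c = math.sin(angle), math.cos(angle)
        px, py = self.x - cx, self.y - cy
        self.x = cx + px * c - py * s
        self.y = cy + px * s + py * c

class SourceGroup:
    """Hierarchical relationship: one transform drives all members."""
    def __init__(self, sources):
        self.sources = sources

    def rotate_about(self, cx, cy, angle):
        for src in self.sources:
            src.rotate_about(cx, cy, angle)

# A single gesture parameter (e.g. a hand's azimuth) mapped to a
# rotation of the whole group around the listener:
group = SourceGroup([SoundSource(1.0, 0.0), SoundSource(0.0, 1.0)])
group.rotate_about(0.0, 0.0, math.pi / 2)
print([(round(s.x, 6), round(s.y, 6)) for s in group.sources])
```

One gesture thus controls many sources coherently through the hierarchy; an emergent relationship would instead let each source update its position from the positions of its neighbors.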