We present an ensemble of tangible objects and software modules designed for musical interaction and performance. The tangible interfaces form a set of connected objects communicating wirelessly. A central concept is to let users determine the final musical function of the objects, favoring customization, assembly, and repurposing. This may involve attaching the wireless interfaces to existing everyday objects or musical instruments. Moreover, gesture analysis and recognition modules allow users to define their own actions and motions for the control of sound parameters. Various sound engines and interaction scenarios were built and evaluated; examples developed in a music pedagogy context are described.
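The user-defined action-to-sound mapping mentioned above could be sketched as follows, assuming a simple nearest-neighbor recognizer over motion features; all names, feature values, and parameter mappings here are illustrative, not the system's actual implementation:

```python
# Hypothetical sketch: recognize a user-defined gesture from motion features
# and map it to a sound parameter. All labels and values are illustrative.
import math

def nearest_gesture(sample, templates):
    """Return the label of the recorded template whose feature vector is
    closest (Euclidean distance) to the incoming motion sample."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda label: dist(sample, templates[label]))

# Users record their own gestures: label -> averaged accelerometer features
templates = {
    "shake": [0.9, 0.8, 0.7],
    "tilt":  [0.1, 0.2, 0.9],
}

# Each recognized gesture drives an assumed sound parameter
param_map = {"shake": ("grain_density", 0.8), "tilt": ("filter_cutoff", 0.3)}

label = nearest_gesture([0.85, 0.75, 0.65], templates)
print(label, param_map[label])  # → shake ('grain_density', 0.8)
```

Because the templates are supplied by the users themselves, repurposing an object for a new musical function amounts to recording a new set of gesture examples.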