While mixed reality has inspired many new musical instruments, few approaches explore the potential of mobile setups. We present a new musical interaction concept, called opportunistic music, which lets musicians recreate a hardware musical controller from everyday objects in their immediate environment. This approach exploits the physical attributes of real objects for controlling music. Our prototype combines a stereo-vision tracking system with force-sensing resistor (FSR) sensors, allowing musicians to define and interact with opportunistic tangible widgets. Linking these widgets to sound processes enables the interactive creation of musical pieces in which musicians draw inspiration from their surrounding environment.
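As a rough illustration of the widget-to-sound linking described above, the sketch below maps one tracked widget onto a single sound parameter. All names here are hypothetical: the paper's actual pipeline (stereo-vision tracking plus FSR sensing) is not a public API, and the dead-zone threshold, workspace range, and cutoff range are illustrative assumptions.

```python
# Hypothetical sketch of an "opportunistic" tangible widget mapped to a
# sound parameter. WidgetReading, map_to_cutoff, and all ranges below are
# illustrative assumptions, not the authors' implementation.

from dataclasses import dataclass

@dataclass
class WidgetReading:
    """One tracked sample: 3D position from stereo vision, pressure from an FSR."""
    x: float          # metres within an assumed calibrated workspace
    y: float
    z: float
    pressure: float   # normalised FSR reading, 0.0 (no touch) .. 1.0 (full press)

def map_to_cutoff(reading: WidgetReading,
                  x_range=(0.0, 0.5),
                  cutoff_range=(200.0, 8000.0)) -> float:
    """Map the widget's x position linearly onto a filter cutoff in Hz.

    The FSR gates the interaction: the widget emits a parameter only
    while it is actually being pressed.
    """
    if reading.pressure < 0.1:          # dead zone: ignore light touches
        return 0.0                      # 0 Hz == widget inactive
    lo, hi = x_range
    t = (reading.x - lo) / (hi - lo)    # normalise position to [0, 1]
    t = max(0.0, min(1.0, t))           # clamp to the workspace bounds
    c_lo, c_hi = cutoff_range
    return c_lo + t * (c_hi - c_lo)

# A firm press at the middle of the workspace lands mid-range.
mid = map_to_cutoff(WidgetReading(0.25, 0.0, 0.0, pressure=0.8))
```

In a real setup this mapping would run per video frame, with the resulting value sent to a sound process (for example over OSC); the linear map and pressure gate stand in for whichever mapping a musician assigns to each improvised widget.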