The importance of parameter mapping in electronic instrument design
NIME '02 Proceedings of the 2002 conference on New interfaces for musical expression
Bimanuality in alternate musical instruments
NIME '03 Proceedings of the 2003 conference on New interfaces for musical expression
StickMusic: using haptic feedback with a phase vocoder
NIME '04 Proceedings of the 2004 conference on New interfaces for musical expression
Towards a catalog and software library of mapping methods
NIME '06 Proceedings of the 2006 conference on New interfaces for musical expression
Gesture control of sounds in 3D space
NIME '07 Proceedings of the 7th international conference on New interfaces for musical expression
A unified toolkit for accessing human interface devices in pure data and Max/MSP
NIME '07 Proceedings of the 7th international conference on New interfaces for musical expression
The [hid] toolkit is a set of software objects for designing computer-based gestural instruments. All too frequently, computer-based performers are tied to the keyboard-mouse-monitor model, narrowly constraining the range of possible gestures. A multitude of gestural input devices is readily available, making it easy to utilize a broader range of gestures. Human Interface Devices (HIDs) such as joysticks, tablets, and gamepads are inexpensive and can make good musical controllers; some even provide haptic feedback. The [hid] toolkit provides a unified, consistent framework for getting gestural data from these devices, controlling their feedback, and mapping this data to the desired output. The [hid] toolkit is built in Pd, which provides an ideal platform for this work, combining the ability to synthesize and control audio and video. The addition of easy access to gestural data allows for rapid prototyping, and a usable environment also makes computer music instrument design accessible to novices.
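The mapping stage the abstract describes (raw HID data to a musical parameter) happens inside Pd patches in the actual toolkit. As a rough illustration of the idea only, the hypothetical Python function below (not part of the [hid] toolkit) maps a raw joystick axis reading to an oscillator frequency; the exponential curve is a common choice because pitch perception is roughly logarithmic.

```python
# Hypothetical sketch of a gestural mapping: raw HID axis -> frequency.
# The [hid] toolkit performs this kind of mapping inside Pd; this is a
# standalone illustration of one common mapping curve, not toolkit code.

def axis_to_freq(axis_value, axis_max=255, f_lo=110.0, f_hi=1760.0):
    """Map a raw axis reading (0..axis_max) exponentially onto [f_lo, f_hi] Hz."""
    t = axis_value / axis_max          # normalize the reading to 0..1
    return f_lo * (f_hi / f_lo) ** t   # exponential (equal-ratio) interpolation

# The endpoints land on the range limits; the midpoint falls at the
# geometric mean of the range (A4 = 440 Hz for these defaults).
print(axis_to_freq(0))      # 110.0
print(axis_to_freq(255))    # 1760.0
print(axis_to_freq(127.5))  # 440.0
```

A linear mapping would bunch most of the perceived pitch change into the top of the axis travel; the exponential form spreads it evenly, which is why this curve is the usual default for pitch-like parameters.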