Music, cognition, and computerized sound
Mouthbrush: drawing and painting by hand and mouth. ICMI '03: Proceedings of the 5th International Conference on Multimodal Interfaces
Tongue 'n' Groove: an ultrasound based music controller. NIME '02: Proceedings of the 2002 Conference on New Interfaces for Musical Expression
Designing, playing, and performing with a vision-based mouth interface. NIME '03: Proceedings of the 2003 Conference on New Interfaces for Musical Expression
Bimanuality in alternate musical instruments. NIME '03: Proceedings of the 2003 Conference on New Interfaces for Musical Expression
Head-tracking for gestural and continuous control of parameterized audio effects. NIME '03: Proceedings of the 2003 Conference on New Interfaces for Musical Expression
A novel face-tracking mouth controller and its application to interacting with bioacoustic models. NIME '04: Proceedings of the 2004 Conference on New Interfaces for Musical Expression
Sonification of facial actions for musical expression. NIME '05: Proceedings of the 2005 Conference on New Interfaces for Musical Expression
The Wahwactor: a voice-controlled wah-wah pedal. NIME '05: Proceedings of the 2005 Conference on New Interfaces for Musical Expression
Evaluation of input devices for musical expression: borrowing tools from HCI. Computer Music Journal
Grasping interface with photo sensor for a musical instrument. Proceedings of the 13th International Conference on Human-Computer Interaction, Part II: Novel Interaction Methods and Techniques
Creating new interfaces for musical expression: introduction to NIME. ACM SIGGRAPH 2009 Courses
HCI and the face: towards an art of the soluble. HCI '07: Proceedings of the 12th International Conference on Human-Computer Interaction: Interaction Design and Usability
Balance ball interface for performing arts. HCII '11: Proceedings of the 1st International Conference on Human Interface and the Management of Information: Interacting with Information, Volume Part II
Advances in new interfaces for musical expression. ACM SIGGRAPH 2011 Courses
Advances in new interfaces for musical expression. SIGGRAPH Asia 2012 Courses
Creating new interfaces for musical expression. SIGGRAPH Asia 2013 Courses
We describe a novel musical controller that acquires live video of the user's face, extracts facial feature parameters using a computer vision algorithm, and maps them to expressive musical effects. The controller lets the user modify synthesized or audio-filtered musical sound in real time through facial movement.
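The abstract does not specify the mapping from facial features to effect parameters, so the following is only a hedged sketch of the general pattern such a controller follows: per video frame, take a normalized feature value (here a hypothetical `mouth_openness` in [0, 1] standing in for the vision stage), smooth it to suppress frame-to-frame tracking jitter, and map it exponentially onto a filter cutoff in Hz. All names and constants are illustrative assumptions, not the paper's implementation.

```python
def map_feature_to_cutoff(mouth_openness, f_min=200.0, f_max=8000.0):
    """Map a normalized facial feature (0..1) to a filter cutoff in Hz.

    Exponential interpolation so equal feature changes give roughly equal
    perceived pitch steps. Hypothetical mapping, not the paper's own.
    """
    x = min(max(mouth_openness, 0.0), 1.0)  # clamp to [0, 1]
    return f_min * (f_max / f_min) ** x


class OnePoleSmoother:
    """One-pole low-pass filter to suppress vision-tracking jitter."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # 0 < alpha <= 1; higher = faster response
        self.state = None

    def __call__(self, value):
        if self.state is None:
            self.state = value  # initialize on first frame
        else:
            self.state += self.alpha * (value - self.state)
        return self.state


# Per video frame: smooth the raw feature, then map it to the effect.
smoother = OnePoleSmoother(alpha=0.3)
for raw in [0.0, 0.5, 0.55, 0.52, 1.0]:  # stand-in for vision output
    cutoff = map_feature_to_cutoff(smoother(raw))
```

In a real-time system the loop body would run once per captured frame, with the resulting cutoff sent to the audio engine (e.g. via OSC or MIDI); the smoothing constant trades responsiveness against stability of the tracked feature.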