Here we propose a novel musical controller that acquires imaging data of the tongue with a two-dimensional medical ultrasound scanner. A computer vision algorithm extracts a discrete tongue-shape representation from each image, which drives a musical synthesizer and musical effects in real time. We evaluate the mapping space between tongue shape and controller parameters, and its expressive characteristics.
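The pipeline described above (ultrasound frame, to extracted tongue shape, to a handful of control parameters) can be sketched in a few lines. The abstract does not specify the vision algorithm or the mapping, so everything below is hypothetical: the per-column brightest-pixel contour, the `cutoff`/`pitch_bend` parameter names, and the 0–127 control range are all illustrative assumptions, not the authors' method.

```python
import numpy as np

def extract_tongue_contour(frame: np.ndarray) -> np.ndarray:
    """For each image column, take the row of peak brightness as the
    tongue-surface depth. A crude stand-in for the vision step; real
    ultrasound contour tracking is considerably more involved."""
    return frame.argmax(axis=0).astype(float)

def contour_to_params(contour: np.ndarray, n_rows: int) -> dict:
    """Reduce the contour to two scalars and scale them to 0..127,
    a common range for synthesizer control (hypothetical mapping)."""
    depth = contour.mean() / (n_rows - 1)            # average tongue height
    slope, _ = np.polyfit(np.arange(len(contour)), contour, 1)
    tilt = np.clip(slope / 2.0 + 0.5, 0.0, 1.0)      # front-back tilt
    return {"cutoff": round(depth * 127), "pitch_bend": round(tilt * 127)}

# Synthetic frame: a bright ridge sloping across a 64x64 "ultrasound" image.
rows, cols = 64, 64
frame = np.zeros((rows, cols))
for c in range(cols):
    frame[20 + c // 4, c] = 1.0

contour = extract_tongue_contour(frame)
params = contour_to_params(contour, rows)
```

In a live setting this pair of functions would run once per incoming scanner frame, with the resulting parameters sent on to the synthesizer (e.g. as controller messages), so the update rate of the instrument is bounded by the ultrasound frame rate.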