The media equation: how people treat computers, television, and new media like real people and places.
Designing sociable robots.
Facing the music: a facial action controlled musical interface. CHI '01 Extended Abstracts on Human Factors in Computing Systems.
Real-time, fully automatic upper facial feature tracking. FGR '02 Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition.
The Electronic Sitar controller. NIME '04 Proceedings of the 2004 conference on New interfaces for musical expression.
A novel face-tracking mouth controller and its application to interacting with bioacoustic models. NIME '04 Proceedings of the 2004 conference on New interfaces for musical expression.
Sonification of facial actions for musical expression. NIME '05 Proceedings of the 2005 conference on New interfaces for musical expression.
Bangarama: creating music with headbanging. NIME '05 Proceedings of the 2005 conference on New interfaces for musical expression.
Creating new interfaces for musical expression: introduction to NIME. ACM SIGGRAPH 2009 Courses.
Advances in new interfaces for musical expression. ACM SIGGRAPH 2011 Courses.
Advances in new interfaces for musical expression. SIGGRAPH Asia 2012 Courses.
Creating new interfaces for musical expression. SIGGRAPH Asia 2013 Courses.
This paper describes a system that uses the output of head-tracking and gesture-recognition software to drive a parameterized guitar-effects synthesizer in real time.
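The core idea in the abstract is a mapping from tracked head parameters to synthesizer effect parameters. The paper does not specify the mapping, so the following is only a minimal sketch under assumed names: yaw and pitch angles from a hypothetical head tracker are linearly scaled and clamped onto two illustrative effect controls (a wah filter cutoff and a distortion gain).

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamped."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))  # clamp so extreme head motion saturates the control
    return out_min + t * (out_max - out_min)

def head_pose_to_effects(yaw_deg, pitch_deg):
    """Map head yaw/pitch (degrees) to effect parameters.

    The parameter names and ranges here are assumptions for illustration,
    not the mapping used in the paper.
    """
    return {
        "wah_cutoff_hz": scale(yaw_deg, -45.0, 45.0, 200.0, 2000.0),
        "distortion_gain": scale(pitch_deg, -30.0, 30.0, 0.0, 1.0),
    }

# A neutral pose lands at the midpoint of each assumed range.
params = head_pose_to_effects(0.0, 0.0)
```

In a real-time system this mapping would run once per tracker frame, with the resulting parameter values sent to the effects synthesizer (e.g. as control messages).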