This paper proposes a new perceptual interface for the control of computer-based music production. We address the constraints imposed by the use of musical meta-instruments during live performance or rehearsal by tracking foot motion relative to a visual keyboard. The attribute "visual" reflects the fact that, unlike its physical counterpart, our keyboard provides no force feedback during key presses. The proposed tracking algorithm is structured on two levels: a coarse level for foot regions and a fine level for foot tips. Tracking runs in real time and efficiently handles the merging and splitting of foot regions caused by spatial proximity and cast shadows. The output of the tracker is used for the spatiotemporal detection of key-"press" events.
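To make the coarse-to-fine idea concrete, the following is a minimal sketch, not the authors' implementation: it stands in OpenCV background subtraction for the paper's foot-region tracker, takes the lowest contour point as a crude "foot tip", and fires a key-"press" event when a tip dwells inside a key rectangle for a few frames. All names and parameters (KEY_RECTS, PRESS_FRAMES, MIN_REGION_AREA, the dwell rule) are illustrative assumptions; OpenCV 4 is assumed.

```python
# Hypothetical coarse-to-fine foot tracker driving a visual keyboard (sketch only).
import cv2
import numpy as np

# Visual keyboard layout: key id -> (x, y, w, h) in image coordinates (assumed values).
KEY_RECTS = {0: (50, 400, 80, 60), 1: (150, 400, 80, 60), 2: (250, 400, 80, 60)}
PRESS_FRAMES = 5          # frames a foot tip must dwell inside a key to count as a "press"
MIN_REGION_AREA = 500     # coarse level: ignore blobs too small to be feet

bg_subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
dwell = {k: 0 for k in KEY_RECTS}   # per-key dwell counters for spatiotemporal detection


def coarse_foot_regions(frame):
    """Coarse level: foreground blobs large enough to be feet, with shadows suppressed."""
    mask = bg_subtractor.apply(frame)
    mask[mask == 127] = 0                      # MOG2 marks shadow pixels as 127; drop them
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > MIN_REGION_AREA]


def fine_foot_tip(contour):
    """Fine level: take the lowest contour point as a crude foot-tip estimate."""
    pts = contour.reshape(-1, 2)
    return tuple(pts[pts[:, 1].argmax()])      # point with maximal y, i.e. closest to the keyboard


def detect_presses(tips):
    """Spatiotemporal rule: a key fires once a tip stays inside it for PRESS_FRAMES frames."""
    events = []
    for key, (x, y, w, h) in KEY_RECTS.items():
        inside = any(x <= tx <= x + w and y <= ty <= y + h for tx, ty in tips)
        dwell[key] = dwell[key] + 1 if inside else 0
        if dwell[key] == PRESS_FRAMES:
            events.append(key)
    return events


cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    tips = [fine_foot_tip(c) for c in coarse_foot_regions(frame)]
    for key in detect_presses(tips):
        print(f"key-press event on key {key}")
    cv2.imshow("feet", frame)
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```

The dwell counter is one simple way to turn per-frame tip positions into discrete press events; the paper's own spatiotemporal detection and its handling of region merging/splitting are more involved than this sketch.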