EyesWeb: Toward Gesture and Affect Recognition in Interactive Dance and Music Systems
Computer Music Journal
The EyesWeb project concerns the development of a system for real-time analysis of the full-body movement and gesture of one or more humans, with a particular focus on affective and emotional content. This information is used to control and generate sound, music, and visual media, and to drive actuators (e.g., robots). A main goal is to explore extensions of musical language toward gesture and visual languages. The paper describes the state of the art of the project and the design of the current EyesWeb system modules (hardware and software), reports on their testing in public events, and discusses ongoing developments.
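To illustrate the kind of mapping the abstract describes, where expressive movement features drive sound control, here is a minimal Python sketch. It is not part of EyesWeb and does not use its API; all function names are hypothetical. It estimates a common movement feature, "quantity of motion" (the fraction of a silhouette that changed between frames), and maps it to a MIDI-style controller value.

```python
# Hypothetical sketch of a movement-feature-to-sound mapping.
# Frames are modeled as flat lists of pixel intensities in [0, 1].

def quantity_of_motion(prev_frame, curr_frame, threshold=0.1):
    """Fraction of pixels whose value changed by more than `threshold`."""
    changed = sum(
        1 for p, c in zip(prev_frame, curr_frame) if abs(c - p) > threshold
    )
    return changed / len(curr_frame)

def qom_to_control(qom, lo=0, hi=127):
    """Map a quantity-of-motion value in [0, 1] to a MIDI-style
    controller value in [lo, hi], clamping out-of-range input."""
    return round(lo + (hi - lo) * max(0.0, min(1.0, qom)))

if __name__ == "__main__":
    still = [0.0] * 100
    moving = [1.0] * 30 + [0.0] * 70   # 30% of the silhouette moved
    qom = quantity_of_motion(still, moving)
    print(qom, qom_to_control(qom))    # 0.3 -> controller value 38
```

In a real system of this kind, the frames would come from a camera and silhouette extraction stage, and the controller value would be sent to a sound-synthesis engine; this sketch only shows the shape of the feature-to-parameter mapping.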