Machine musicianship
Composing Interactive Music: Techniques and Ideas Using Max
ICMAI '02 Proceedings of the Second International Conference on Music and Artificial Intelligence
Sonigraphical instruments: from FMOL to the reacTable
NIME '03 Proceedings of the 2003 conference on New interfaces for musical expression
Implicit relevance feedback in interactive music: issues, challenges, and case studies
IIiX Proceedings of the 1st international conference on Information interaction in context
When Cultures Meet: Modelling Cross-Cultural Knowledge Spaces
Proceedings of the 2008 conference on Information Modelling and Knowledge Bases XIX
From motion to emotion: a wearable system for the multimedia enrichment of a Butoh dance performance
Journal of Mobile Multimedia
Unifying performer and accompaniment
CMMR'05 Proceedings of the Third international conference on Computer Music Modeling and Retrieval
This paper presents some of our recent research on computational models and algorithms for real-time analysis of full-body human movement. The focus is on techniques for extracting, in real time, expressive cues relevant to KANSEI and emotional content in human expressive gesture, e.g., in dance and music performances. Expressive gesture can open new perspectives for the design of interactive systems. The EyesWeb open software platform is a main concrete result of this research. EyesWeb is used in interactive applications based on the paradigm of expressive gesture, including music and other artistic productions, interactive museum exhibits, and therapy and rehabilitation. EyesWeb is freely available from www.eyesweb.org.
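One expressive cue commonly discussed in this line of work is quantity of motion, an estimate of overall movement activity derived from differences between consecutive video frames. The sketch below is only an illustration of that general idea in plain Python (frames as 2D lists of pixel intensities); it is not EyesWeb's actual implementation, and the function name and threshold are assumptions for the example.

```python
def quantity_of_motion(frames, threshold=0.1):
    """Rough quantity-of-motion cue (illustrative, not EyesWeb's code):
    for each consecutive pair of frames, the fraction of pixels whose
    intensity change exceeds `threshold`. Values lie in [0, 1]."""
    qom = []
    for prev, curr in zip(frames, frames[1:]):
        changed = total = 0
        for row_p, row_c in zip(prev, curr):
            for p, c in zip(row_p, row_c):
                total += 1
                if abs(c - p) > threshold:
                    changed += 1
        qom.append(changed / total)
    return qom

# Synthetic 4x4 frames: a still pair, then a small "blob" that moves.
a = [[0.0] * 4 for _ in range(4)]
a[1][1] = 1.0
b = [[0.0] * 4 for _ in range(4)]
b[2][2] = 1.0
print(quantity_of_motion([a, a, b]))  # [0.0, 0.125]
```

In a real-time setting such a cue would be computed on a live camera stream (typically on a segmented body silhouette) and smoothed over a short window before being mapped to sound or visuals.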