This paper describes a trans-domain mapping (TDM) framework for translating meaningful activities from one creative domain onto another. The multi-disciplinary framework is designed to provide an intuitive, non-intrusive interactive multimedia performance interface that gives users or performers real-time control of multimedia events through their physical movements. It is intended as a highly dynamic real-time performance tool, sensing and tracking activities and changes in order to drive interactive multimedia performances.

Starting from a straightforward definition of the TDM framework, the paper reports several implementations and multi-disciplinary collaborative projects built on the proposed framework, including a motion- and colour-sensitive system, a sensor-based system for triggering musical events, and a distributed multimedia server for audio mapping of a real-time face tracker, and discusses the mapping strategies appropriate to each.

Finally, plausible future directions, developments and explorations with the proposed framework are discussed, including stage augmentation and virtual and augmented reality, which involve sensing and mapping physical and non-physical changes onto multimedia control events.
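The core idea of a trans-domain mapping, translating a measurement in a physical domain into a control value in a musical domain, can be sketched in a few lines. The abstract does not specify the framework at code level, so everything here is a hypothetical illustration: the function names (`frame_difference`, `map_to_velocity`) and the choice of inter-frame motion intensity as the input and MIDI note velocity as the output are assumptions, not the authors' implementation.

```python
def frame_difference(prev, curr):
    """Mean absolute pixel difference between two grayscale frames.

    A crude stand-in for the motion-sensing front end: higher values
    mean more movement between consecutive frames.
    """
    assert len(prev) == len(curr)
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(prev)


def map_to_velocity(motion, max_motion=255.0):
    """Linearly map a motion-intensity value onto the MIDI velocity
    range 0-127, clamping out-of-range input."""
    clamped = max(0.0, min(motion, max_motion))
    return int(round(127 * clamped / max_motion))


# Example: a still scene versus a sudden movement in one pixel.
still = frame_difference([10, 10, 10], [10, 10, 10])    # 0.0
burst = frame_difference([10, 10, 10], [200, 10, 10])   # ~63.3
print(map_to_velocity(still), map_to_velocity(burst))   # prints "0 32"
```

In a real system the linear map would typically be replaced by a tuned transfer curve, and the output sent to a synthesis engine (e.g. via MIDI or OSC) rather than printed; the sketch only shows the domain-to-domain translation step itself.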