This article describes the design and implementation of the Multimodal Interactive Musical Improvisation (Mimi) system. Unique to Mimi is its visual interface, which provides the performer with instantaneous and continuous information on the state of the system, in contrast to other human-machine improvisation systems, which require performers to intuit possible extemporizations in response to machine-generated music without forewarning. In Mimi, the information displayed extends into the near future and reaches back into the recent past, giving the performer awareness of the musical context and allowing them to plan their responses accordingly. This article presents the details of Mimi's system design, its visual interface, and its implementation using the formalism defined by François' Software Architecture for Immersipresence (SAI) framework. Mimi is the result of a collaborative, iterative design process. We recorded the design sessions and present findings from the transcripts that provide evidence for the impact of visual support on improvisation planning and design. The findings demonstrate that Mimi's visual interface offers musicians the opportunity to anticipate and to review decisions, making it an ideal performance and pedagogical tool for improvisation: it allows novices to create more contextually relevant improvisations and experts to be more inventive in their extemporizations.