OMax brothers: a dynamic topology of agents for improvization learning

  • Authors:
  • Gérard Assayag; Georges Bloch; Marc Chemillier; Arshia Cont; Shlomo Dubnov

  • Affiliations:
  • IRCAM-CNRS UMR STMS, Paris, France; University of Strasbourg, Strasbourg, France; University of Caen, Caen Cedex, France; IRCAM-UCSD, Paris, France; UCSD, La Jolla, CA

  • Venue:
  • Proceedings of the 1st ACM workshop on Audio and music computing multimedia
  • Year:
  • 2006

Abstract

We describe a multi-agent architecture for an improvization-oriented musician-machine interaction system that learns in real time from human performers. The improvization kernel is based on sequence modeling and statistical learning. The working system involves a hybrid architecture using two popular composition/performance environments, Max and OpenMusic, which run concurrently and communicate with each other, each handling the process at a different time/memory scale. The system is capable of processing real-time audio/video as well as MIDI. After discussing the general cognitive background of improvization practices, the statistical modeling tools and the concurrent agent architecture are presented. Finally, a prospective Reinforcement Learning scheme for enhancing the system's realism is described.
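The sequence-modeling kernel referred to above is, in the published OMax line of work, based on the Factor Oracle automaton (Allauzen, Crochemore, and Raffinot), learned incrementally from the performer's symbol stream. The following is a minimal illustrative sketch, not the authors' implementation: a character sequence stands in for a stream of MIDI-derived symbols, and the `continuity` parameter and navigation heuristic are simplifying assumptions.

```python
import random

def build_factor_oracle(seq):
    """Incrementally build a factor oracle over seq.

    Returns (trans, sfx) for states 0..len(seq):
    trans[i] maps a symbol to a forward transition target,
    sfx[i] is the suffix link of state i (sfx[0] == -1).
    """
    n = len(seq)
    trans = [dict() for _ in range(n + 1)]
    sfx = [-1] * (n + 1)
    for i, c in enumerate(seq):
        trans[i][c] = i + 1            # primary forward transition
        k = sfx[i]
        while k > -1 and c not in trans[k]:
            trans[k][c] = i + 1        # extra transition from suffix path
            k = sfx[k]
        sfx[i + 1] = 0 if k == -1 else trans[k][c]
    return trans, sfx

def improvise(seq, length, continuity=0.8, rng=None):
    """Generate a recombined sequence by navigating the oracle:
    mostly advance forward (replaying learned material), occasionally
    jump along a suffix link to splice in a different continuation.
    A toy navigation policy; the real system uses richer heuristics.
    """
    rng = rng or random.Random(0)
    _, sfx = build_factor_oracle(seq)
    out, state = [], 0
    for _ in range(length):
        at_end = state == len(seq)
        if at_end or (rng.random() >= continuity and sfx[state] > 0):
            state = sfx[state]         # recombination jump
        out.append(seq[state])
        state += 1
    return ''.join(out)
```

For example, `build_factor_oracle("aabb")` yields suffix links `[-1, 0, 1, 0, 3]`, and `improvise("aabbaab", 16)` produces a stream that locally resembles the input while recombining its repeated factors.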