Mixage mobile (Mobile Mixing)

  • Authors:
  • Olivier Delerue; Olivier Warusfel

  • Affiliations:
  • IRCAM, Paris, France; IRCAM, Paris, France

  • Venue:
  • IHM '06: Proceedings of the 18th International Conference of the Association Francophone d'Interaction Homme-Machine
  • Year:
  • 2006


Abstract

This paper takes two examples, audio augmented reality and Wave Field Synthesis, to show how the task of the sound engineer is evolving: a task commonly understood as static necessarily becomes mobile in these particular contexts.

We first describe these new tasks and show why the corresponding tools must evolve accordingly: these mixing situations cannot be handled properly without a dedicated design tool that allows the virtual scene to be controlled while walking through the rendering space.

We then present our proposal for such a context through two software prototypes. The first, the "Listen-Space" application, is an authoring tool designed for the particular case of audio augmented reality; it supports mobile control of spatialization when run on a wireless ultra-portable tablet PC. The second, the "L-Scanner", is a "see-through" interface dedicated to the control of sound spatialization.

We finally discuss applications of our system such as collaborative mixing, made possible by a distributed software architecture and by the split between the graphical user control interface and the DSP audio rendering engine: in this context, each participant can work on a shared virtual sound scene while adapting its representation to personal taste and needs.
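The split between control interfaces and the rendering engine can be pictured as control interfaces sending serialized scene updates that the engine applies to one shared scene. The abstract does not specify a protocol or message format, so everything below (the JSON message shape, the `SoundScene` class, the `control_message` helper) is a hypothetical sketch of the idea, not the authors' implementation:

```python
import json

class SoundScene:
    """Shared virtual sound scene held by the rendering engine:
    named sources with 2-D positions in the rendering space."""
    def __init__(self):
        self.sources = {}

    def apply(self, message: str):
        """Apply a serialized update received from a remote control interface."""
        update = json.loads(message)
        self.sources[update["source"]] = (update["x"], update["y"])

def control_message(source: str, x: float, y: float) -> str:
    """What a mobile control interface (e.g. on a tablet PC) might send
    while its user walks through the rendering space."""
    return json.dumps({"source": source, "x": x, "y": y})

# Two participants edit the same scene through messages; each could
# still draw its own local graphical representation of that scene.
scene = SoundScene()
scene.apply(control_message("violin", 1.5, -0.5))
scene.apply(control_message("voice", 0.0, 2.0))
print(scene.sources)
```

Because every participant only exchanges such messages with the engine, each one is free to render the shared scene however suits them, which is what makes the collaborative-mixing scenario possible.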