Virtual reality for palmtop computers
ACM Transactions on Information Systems (TOIS)
MidiSpace: a temporal constraint-based music spatializer
MULTIMEDIA '98 Proceedings of the sixth ACM international conference on Multimedia
Multimedia Systems - Special issue on audio and multimedia
Proceedings of the 15th annual ACM symposium on User interface software and technology
Hand-Held Windows: Towards Effective 2D Interaction in Immersive Virtual Environments
VR '99 Proceedings of the IEEE Virtual Reality
Immersive Audio-Augmented Environments: The LISTEN Project
IV '01 Proceedings of the Fifth International Conference on Information Visualisation
LScanner: a "peephole"-style interface for spatialization control
IHM '07 Proceedings of the 19th International Conference of the Association Francophone d'Interaction Homme-Machine
Fluid interaction in audio-guided museum visit: authoring tool and visitor device
VAST '10 Proceedings of the 11th International Conference on Virtual Reality, Archaeology and Cultural Heritage
This paper takes two examples, audio augmented reality and Wave Field Synthesis, as evidence of an evolution in the sound engineer's task: a task commonly understood to be static necessarily becomes mobile in these particular contexts. We first describe these new tasks and show why the corresponding tools must evolve: such mixing situations cannot be properly handled without an appropriate design tool that allows controlling the virtual scene while walking through the rendering space. We then present our proposal for this context through two software prototypes. First, the "Listen-Space" application, an authoring tool specially designed for the particular case of audio augmented reality, which allows mobile control of spatialization when run on a wireless ultra-portable tablet PC. Second, the "L-Scanner", a "see-through" interface implementation dedicated to the control of sound spatialization. We finally discuss applications of our system, such as collaborative mixing, made possible by a distributed software architecture and by the split between the graphical user-control interface and the DSP audio rendering engine: in this context, each participant can work on a shared virtual sound scene while adapting its representation to their personal tastes and needs.
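The collaborative-mixing idea in the abstract rests on one architectural split: a single shared scene model that all participants edit, decoupled both from the DSP rendering engine (which observes scene changes) and from each client's local representation. The sketch below is a minimal illustration of that split, not the authors' implementation; all class and method names (`SharedScene`, `ClientView`, `move`, etc.) are hypothetical.

```python
# Hedged sketch of the shared-scene / per-client-view split described in the
# abstract. One SharedScene holds the sound-source positions every participant
# edits; observers (e.g. a DSP rendering engine) are notified of each change;
# each ClientView adapts the *representation* only, never the shared state.
from dataclasses import dataclass


@dataclass
class SoundSource:
    name: str
    x: float  # position in the rendering space (arbitrary units)
    y: float


class SharedScene:
    """The single virtual sound scene shared by all participants."""

    def __init__(self):
        self.sources = {}
        self.observers = []  # callbacks, e.g. the DSP audio rendering engine

    def move(self, name, x, y):
        """Any client may move a source; all observers see the change."""
        src = SoundSource(name, x, y)
        self.sources[name] = src
        for notify in self.observers:
            notify(src)


class ClientView:
    """Per-participant representation of the scene. Here the adaptation is
    just a display scale; a real client might change layout or styling."""

    def __init__(self, scene, scale=1.0):
        self.scene = scene
        self.scale = scale

    def display_position(self, name):
        src = self.scene.sources[name]
        return (src.x * self.scale, src.y * self.scale)
```

Usage follows the abstract's scenario: two participants attach views with different scales to the same scene, one moves a source, and both views (and the rendering observer) reflect the single shared state.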