A framework to manage multimodal fusion of events for advanced interactions within virtual environments

  • Authors:
  • Damien Touraine; Patrick Bourdot; Yacine Bellik; Laurence Bolot

  • Affiliations:
  • University Paris XI, Bât. 508, BP 133, F-91403 Orsay cedex, France (all authors)

  • Venue:
  • EGVE '02 Proceedings of the workshop on Virtual environments 2002
  • Year:
  • 2002


Abstract

This paper describes the EVI3d framework, a distributed architecture developed to enhance interactions within Virtual Environments (VE). The framework manages many multi-sensorial devices, such as trackers, data gloves, speech or gesture recognition systems, and haptic devices. Its structure allows device services and their clients to be dispatched across as many machines as required. Because its time-synchronization system provides dated events, a specific module can be designed to manage multimodal fusion processes. To this end, we describe how the EVI3d framework manages not only low-level events but also abstract modalities. Moreover, the data flow service of the EVI3d framework solves the problem of sharing the virtual scene between modality modules.
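The abstract's core idea — merging dated (timestamped) events from distributed device services and fusing events from different modalities by temporal proximity — can be sketched as follows. The paper does not specify EVI3d's actual API, so the `Event` type, `merge_streams`, `fuse`, and the fixed time-window rule below are all hypothetical illustrations, not the framework's real interface:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Event:
    """A dated event, as produced by a time-synchronized device service."""
    timestamp: float                       # synchronized clock time (seconds)
    modality: str = field(compare=False)   # e.g. "tracker", "speech", "gesture"
    payload: dict = field(compare=False)   # device-specific data

def merge_streams(*streams):
    """Merge per-device event streams (each already time-ordered)
    into a single globally time-ordered stream."""
    return list(heapq.merge(*streams))

def fuse(events, window=0.5):
    """Toy temporal-fusion rule: pair each speech event with the
    nearest gesture event occurring within `window` seconds,
    e.g. to resolve a spoken 'put that there' via a pointing gesture."""
    gestures = [e for e in events if e.modality == "gesture"]
    pairs = []
    for e in events:
        if e.modality != "speech":
            continue
        near = [g for g in gestures if abs(g.timestamp - e.timestamp) <= window]
        if near:
            g = min(near, key=lambda g: abs(g.timestamp - e.timestamp))
            pairs.append((e, g))
    return pairs

# Example: three devices on possibly different machines emit dated events.
tracker = [Event(0.9, "tracker", {"pos": (0.0, 0.0, 0.0)})]
speech = [Event(1.0, "speech", {"utterance": "put that there"})]
gesture = [Event(1.2, "gesture", {"target": "table"})]

timeline = merge_streams(tracker, speech, gesture)
fused = fuse(timeline, window=0.5)
```

The design point the abstract hinges on is that fusion only works if all services stamp events against a common clock; without synchronized timestamps, the window test above would pair unrelated events.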