Supporting interoperability and presence awareness in collaborative mixed reality environments

  • Authors:
  • Oyewole Oyekoya;Ran Stone;William Steptoe;Laith Alkurdi;Stefan Klare;Angelika Peer;Tim Weyrich;Benjamin Cohen;Franco Tecchia;Anthony Steed

  • Affiliations:
  • University College London;IBM Research, Haifa, Israel;Technische Universität München, Germany;Technische Universität München, Germany;Technische Universität München, Germany;Technische Universität München, Germany;University College London;IBM Research, Haifa, Israel;Scuola Superiore Sant'Anna, Pisa, Italy;University College London

  • Venue:
  • Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology
  • Year:
  • 2013


Abstract

In the BEAMING project we have been extending the scope of collaborative mixed reality to include the representation of users in multiple modalities, including augmented reality, situated displays and robots. A single user (a visitor) uses a high-end virtual reality system (the transporter) to be virtually teleported to a real remote location (the destination). The visitor may be tracked in several ways, including emotion and motion capture. We reconstruct the destination and the people within it (the locals). In achieving this scenario, BEAMING has integrated many heterogeneous systems. In this paper, we describe the design and key implementation choices in the Beaming Scene Service (BSS), which allows the various processes to coordinate their behaviour. The core of the system is a lightweight shared object repository that allows loose coupling between processes with very different requirements (e.g. embedded control systems through to mobile apps). The system was also extended to support the notion of presence awareness. We demonstrate two complex applications built with the BSS.
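To illustrate the architectural idea named in the abstract, the following is a minimal sketch of a shared object repository with change notification, which lets independent processes coordinate without direct dependencies. This is purely illustrative: the class, method names, and object keys here are hypothetical and are not the actual BEAMING/BSS interfaces.

```python
# Hypothetical sketch of a shared object repository in the spirit of the
# BSS described in the abstract. All names here are illustrative
# assumptions, not the real BEAMING API.
from collections import defaultdict
from typing import Any, Callable


class SharedObjectRepository:
    """Minimal shared store: processes read and write named objects and
    subscribe to change notifications, staying loosely coupled."""

    def __init__(self) -> None:
        self._objects: dict[str, Any] = {}
        self._subscribers: dict[str, list[Callable[[str, Any], None]]] = defaultdict(list)

    def put(self, name: str, value: Any) -> None:
        # Store the object and notify every subscriber of the new value.
        self._objects[name] = value
        for callback in self._subscribers[name]:
            callback(name, value)

    def get(self, name: str) -> Any:
        # Return the current value, or None if the object is unknown.
        return self._objects.get(name)

    def subscribe(self, name: str, callback: Callable[[str, Any], None]) -> None:
        # Register interest in an object without knowing who updates it.
        self._subscribers[name].append(callback)


# Example: a renderer process observes the visitor's tracked head pose,
# which a separate tracking process publishes into the repository.
repo = SharedObjectRepository()
observed = []
repo.subscribe("visitor/head_pose", lambda name, value: observed.append(value))
repo.put("visitor/head_pose", (0.0, 1.7, 0.0))
```

A decoupled design like this is one way to let heterogeneous clients (embedded controllers, mobile apps, VR renderers) share state through a common repository rather than through point-to-point links.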