Scene synchronization for real-time interaction in distributed mixed reality and virtual reality environments

  • Authors: Felix G. Hamza-Lup; Jannick P. Rolland
  • Affiliations: School of Electrical Engineering and Computer Science, University of Central Florida; School of Electrical Engineering and Computer Science & School of Optics-CREOL, University of Central Florida
  • Venue: Presence: Teleoperators and Virtual Environments - Special issue: Advances in collaborative virtual environments
  • Year: 2004


Abstract

Advances in computer networks and rendering systems facilitate the creation of distributed collaborative environments in which the distribution of information across remote locations allows efficient communication. One of the challenges in networked virtual environments is maintaining a consistent view of the shared state in the presence of inevitable network latency and jitter. A consistent view of a shared scene may significantly increase the sense of presence among participants and facilitate their interactivity. The dynamic shared state is directly affected by the frequency of the actions applied to the objects in the scene. Mixed Reality (MR) and Virtual Reality (VR) environments contain several types of action producers, including human users, a wide range of electronic motion sensors, and haptic devices. In this paper, we propose a novel criterion for categorizing distributed MR/VR systems and present an adaptive synchronization algorithm for distributed MR/VR collaborative environments. Results show that, in spite of significant network latency, the dynamic shared state can be kept consistent at multiple remotely located sites at low update frequencies.
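The abstract does not detail the adaptive synchronization algorithm itself, so it is not reproduced here. As a generic illustration of the underlying problem — keeping a dynamic shared state consistent despite network latency — a common technique is timestamped state updates with extrapolation (dead reckoning) on the receiving site. The sketch below is an assumption-laden toy model, not the paper's method; the `ObjectUpdate` type and `extrapolate` function are hypothetical names, and sender clocks are assumed synchronized (e.g., via NTP).

```python
from dataclasses import dataclass

@dataclass
class ObjectUpdate:
    """A timestamped state update for one shared scene object (hypothetical)."""
    position: float   # 1-D position for simplicity
    velocity: float   # units per second
    timestamp: float  # sender clock, assumed synchronized across sites

def extrapolate(update: ObjectUpdate, now: float) -> float:
    """Estimate the object's current position at the receiving site.

    Compensates for network latency by extrapolating along the last
    reported velocity for the time elapsed since the update was sent.
    """
    dt = now - update.timestamp
    return update.position + update.velocity * dt

# A site receives an update sent 0.5 s ago; the object moved meanwhile.
u = ObjectUpdate(position=0.0, velocity=2.0, timestamp=10.0)
print(extrapolate(u, now=10.5))  # 1.0
```

At low update frequencies such extrapolation drifts between updates, which is one reason adaptive schemes (adjusting behavior to measured latency and update rate) are of interest in distributed MR/VR.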