Streaming 3D shape deformations in collaborative virtual environment

  • Authors:
  • Ziying Tang; Guodong Rong; Xiaohu Guo; B. Prabhakaran

  • Affiliations:
  • Comput. Sci. Dept., Univ. of Texas at Dallas, Dallas, TX, USA (all authors)

  • Venue:
  • VR '10 Proceedings of the 2010 IEEE Virtual Reality Conference
  • Year:
  • 2010


Abstract

Collaborative virtual environments have been limited to static or rigid 3D models because of the difficulty of streaming, in real time, the large amounts of data required to describe the motions of 3D deformable models. Streaming the shape deformations of complex 3D models arising from a remote user's manipulations is a challenging task. In this paper, we present a framework based on spectral transformation that encodes surface deformations in the frequency domain to meet this challenge, and we demonstrate its use in a distributed virtual environment. Our research contributions through this framework include: i) we reduce the amount of data that must be streamed for surface deformations, since we transmit only the transformed spectral coefficients rather than the deformed model itself; ii) we propose a mapping method that lets models at multiple resolutions undergo the same deformations simultaneously; iii) our streaming strategy tolerates packet loss without any special loss-handling mechanism. Our system guarantees real-time transmission of shape deformations and ensures smooth motion of the 3D models. Moreover, we achieve very effective performance over real Internet conditions as well as on a LAN. Experimental results show low distortion and small delays even when surface deformations of large, complex 3D models are streamed over lossy networks.
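
To make the encode-and-stream idea concrete, the sketch below projects a per-vertex displacement field onto a low-frequency spectral basis and transmits only the resulting coefficients, reconstructing an approximation on the receiver. This is a minimal illustration, not the authors' implementation: the choice of a combinatorial graph Laplacian, the truncation to k basis vectors, and the toy tetrahedron mesh are all assumptions made for the example.

```python
import numpy as np

def laplacian_eigenbasis(adjacency, k):
    """Eigenvectors of the combinatorial graph Laplacian; the k
    lowest-frequency vectors form the spectral basis (an assumed stand-in
    for the paper's spectral transformation)."""
    degree = np.diag(adjacency.sum(axis=1))
    laplacian = degree - adjacency
    _, vectors = np.linalg.eigh(laplacian)   # eigenvalues sorted ascending
    return vectors[:, :k]                    # n_vertices x k

def encode_deformation(displacements, basis):
    """Project per-vertex displacements (n x 3) onto the basis, giving
    k x 3 spectral coefficients -- the only payload that is streamed."""
    return basis.T @ displacements

def decode_deformation(coefficients, basis):
    """Receiver side: reconstruct an approximate displacement field."""
    return basis @ coefficients

# Toy example: a 4-vertex, fully connected mesh (hypothetical data).
adjacency = np.ones((4, 4)) - np.eye(4)
basis = laplacian_eigenbasis(adjacency, k=3)

rng = np.random.default_rng(0)
displacements = rng.normal(scale=0.01, size=(4, 3))  # one frame of deformation

coeffs = encode_deformation(displacements, basis)     # streamed: 3 x 3 floats
approx = decode_deformation(coeffs, basis)            # reconstructed at the receiver
print("reconstruction error:", np.linalg.norm(approx - displacements))
```

Because each frame's coefficients are reconstructed independently against a basis both ends already share, a dropped frame simply means the receiver keeps the previous reconstruction, which is consistent with the loss tolerance the abstract describes.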