Robust and scalable transmission of arbitrary 3D models over wireless networks

  • Authors:
  • Irene Cheng, Lihang Ying, Kostas Daniilidis, Anup Basu

  • Affiliations:
  • Irene Cheng: Department of Computer and Information Science, University of Pennsylvania, Philadelphia, PA and Department of Computing Science, University of Alberta, Edmonton, AB, Canada
  • Lihang Ying: Department of Computing Science, University of Alberta, Edmonton, AB, Canada
  • Kostas Daniilidis: Department of Computer and Information Science, University of Pennsylvania, Philadelphia, PA
  • Anup Basu: Department of Computing Science, University of Alberta, Edmonton, AB, Canada

  • Venue:
  • Journal on Image and Video Processing - 3D Image and Video Processing
  • Year:
  • 2008


Abstract

We describe the transmission of 3D objects, represented by texture and mesh, over unreliable networks, extending our earlier work on regular mesh structures to arbitrary meshes and comparing linear with cubic interpolation. For arbitrary meshes, our approach stripifies the mesh and distributes nearby vertices into different packets, combined with a strategy that requires no retransmission of texture or mesh packets. Only the valence (connectivity) packets need to be retransmitted; however, valence information requires only about 10% of the storage needed for vertices, and even less compared to photorealistic texture. Thus, in the worst case, fewer than 5% of the packets need to be retransmitted for our algorithm to successfully reconstruct an acceptable object under severe packet loss. Although packet loss during transmission has received limited research attention in the past, the topic is important for improving quality under the lossy conditions created by shadowing and interference. We present results from an implementation of the proposed approach using linear, cubic, and Laplacian interpolation, and compare our mesh reconstruction strategy with other methods.
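The interleaving-and-interpolation idea can be sketched in a minimal form as follows. This is a hypothetical simplification, not the paper's implementation: it treats a single triangle strip as a 1D sequence of vertex positions, deals them round-robin into packets, and fills vertices from a lost packet by linear interpolation of the nearest surviving neighbours along the strip (the actual method also handles texture and valence packets, and supports cubic and Laplacian interpolation).

```python
def packetize(vertices, k):
    """Deal consecutive strip vertices round-robin into k packets,
    so that adjacent vertices never share a packet (for k >= 2)."""
    packets = [[] for _ in range(k)]
    for i, v in enumerate(vertices):
        packets[i % k].append((i, v))  # keep the strip index for reassembly
    return packets


def reconstruct(packets, n, lost):
    """Reassemble n vertices; vertices carried by packets whose ids are
    in `lost` are linearly interpolated from surviving strip neighbours."""
    recovered = [None] * n
    for pid, packet in enumerate(packets):
        if pid in lost:
            continue
        for i, v in packet:
            recovered[i] = v
    for i in range(n):
        if recovered[i] is not None:
            continue
        # nearest known neighbours on each side along the strip
        lo = next((j for j in range(i - 1, -1, -1)
                   if recovered[j] is not None), None)
        hi = next((j for j in range(i + 1, n)
                   if recovered[j] is not None), None)
        if lo is not None and hi is not None:
            t = (i - lo) / (hi - lo)
            recovered[i] = tuple(a + t * (b - a)
                                 for a, b in zip(recovered[lo], recovered[hi]))
        elif lo is not None:          # lost vertex at the end of the strip
            recovered[i] = recovered[lo]
        elif hi is not None:          # lost vertex at the start of the strip
            recovered[i] = recovered[hi]
    return recovered
```

Because adjacent strip vertices land in different packets, losing any single packet leaves each missing vertex with intact neighbours to interpolate from, which is why (per the abstract) no retransmission of mesh packets is needed.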