Building low-latency remote rendering systems for interactive 3D graphics rendering on mobile devices

  • Authors: Shu Shi
  • Affiliations: University of Illinois at Urbana-Champaign, Urbana, IL, USA
  • Venue: MM '11: Proceedings of the 19th ACM International Conference on Multimedia
  • Year: 2011

Abstract

The recent explosion of mobile devices is changing people's computing behavior, and more and more applications are being ported to mobile platforms. However, some applications that require intensive computation or network bandwidth, such as 3D video tele-immersion and 3D video gaming, cannot yet run on mobile devices. Remote rendering is a simple but effective solution: a workstation with sufficient computation and network bandwidth resources (e.g., a cloud server) serves as the rendering server. It receives and renders the source content (e.g., 3D graphics or 3D video) and sends the rendering results (2D images) to one or more clients, which simply receive and display the result images. Using remote rendering for 3D games or 3D video on mobile devices can solve both the computation and the bandwidth problem.

However, remotely rendering 3D video or 3D games for mobile devices raises several issues. (1) Interaction latency: the time from the generation of a user interaction request until the first updated frame appears on the mobile client. In our application scenario, interaction latency is mainly determined by the round-trip time of the wireless network. (2) Bandwidth: since the clients in our remote rendering system are mobile devices on wireless networks, the available bandwidth is very limited compared with wired networks; the system design must account for this limitation and minimize bandwidth usage. (3) Real time: the rendering rate of 3D video is determined by the recording 3D camera, and the frame rate of 3D gaming depends on the game motion, so the processing of every frame must be completed before the next frame starts.

My research proposes novel remote rendering designs that enhance the interactive experience of 3D graphics rendering on mobile devices by reducing interaction latency. Unlike conventional approaches, the proposed system imposes no restriction on network latency in order to provide low-latency rendering services. Instead, the rendering server sends an image-based representation of the current 3D scene or model to the client; whenever a viewpoint-change interaction happens, the mobile client can directly synthesize the appropriate image using image-based rendering techniques. The research problems can be summarized as follows: (1) What image-based representation of the 3D scene or model should the rendering server generate in order to reduce interaction latency? (2) Given the real-time requirement, how can the rendering server generate the image-based representation of every frame efficiently? (3) What encoding scheme can compress the generated image-based representation to fit the limited wireless bandwidth? (4) How can a remote rendering system with latency-reduction enhancements be evaluated effectively?

This research started in 2009, and progress has been made on all topics. For problem (1), our MM'09 full paper proposed generating two depth images as the image-based representation and using the 3D image warping algorithm to synthesize the view at the new rendering viewpoint on the mobile client. For problem (2), our MM'10 full paper introduced several computationally efficient algorithms for selecting reference frames for 3D image warping; a minimal sketch of the client-side warping step is given below.
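To make the client-side synthesis step concrete, the following is a minimal sketch of depth-image-based 3D image warping. It assumes a pinhole camera model with shared intrinsics, NumPy arrays, and camera-to-world pose matrices; the function name `warp_depth_image` and the array layout are illustrative assumptions, not the implementation from the papers above.

```python
import numpy as np

def warp_depth_image(color, depth, K, ref_pose, target_pose):
    """Forward-warp a reference color+depth image to a new viewpoint.

    color:  (H, W, 3) uint8 reference image
    depth:  (H, W)    per-pixel depth in the reference camera frame
    K:      (3, 3)    pinhole intrinsics (assumed shared by both views)
    ref_pose, target_pose: (4, 4) camera-to-world transforms
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3 x N

    # Back-project reference pixels to 3D points, then map them to world space.
    pts_cam = np.linalg.inv(K) @ (pix * depth.reshape(1, -1))
    pts_world = ref_pose @ np.vstack([pts_cam, np.ones((1, pts_cam.shape[1]))])

    # Re-project the 3D points into the target camera.
    pts_tgt = (np.linalg.inv(target_pose) @ pts_world)[:3]
    proj = K @ pts_tgt
    z = proj[2]
    u2 = np.round(proj[0] / np.maximum(z, 1e-6)).astype(int)
    v2 = np.round(proj[1] / np.maximum(z, 1e-6)).astype(int)

    # Splat pixels into the new view (no occlusion-compatible ordering here).
    # Pixels with no source remain holes, which is why the MM'09 design uses a
    # second depth image as an additional reference to fill them.
    out = np.zeros_like(color)
    ok = (z > 1e-6) & (u2 >= 0) & (u2 < W) & (v2 >= 0) & (v2 < H)
    out[v2[ok], u2[ok]] = color.reshape(-1, 3)[ok]
    return out
```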
For problem (3), we have studied a novel video coding method that integrates H.264/AVC with the 3D image warping algorithm; in the scenario of real-time 3D game video encoding, this new coding method can potentially beat the state-of-the-art x264 encoder in compression efficiency. The full paper with the latest research results has been accepted by MM'11. For problem (4), we developed a new metric, DOL (Distortion Over Latency), which evaluates the interactive performance of remote rendering systems by combining latency and rendering quality in a single score; that paper was presented at ICME'11.
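As a rough illustration of how 3D image warping could be combined with a conventional residual coder (a generic sketch under my own assumptions, not the actual MM'11 coding scheme), the warped previous frame can serve as the prediction signal so that only the residual needs to be compressed. Here `warp_depth_image` is the sketch above, and `residual_encoder`/`residual_decoder` are placeholders for a standard transform-and-entropy coding stage such as the one in H.264/AVC.

```python
import numpy as np

def encode_frame(curr, prev, prev_depth, K, prev_pose, curr_pose, residual_encoder):
    """Warping-assisted prediction: warp the previously decoded frame to the
    current viewpoint and compress only the difference (residual)."""
    prediction = warp_depth_image(prev, prev_depth, K, prev_pose, curr_pose)
    residual = curr.astype(np.int16) - prediction.astype(np.int16)
    return residual_encoder(residual), prediction

def decode_frame(bitstream, prediction, residual_decoder):
    """The client computes the same warped prediction locally, so it only
    needs the residual bitstream to reconstruct the frame."""
    residual = residual_decoder(bitstream)
    return np.clip(prediction.astype(np.int16) + residual, 0, 255).astype(np.uint8)
```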