Using graphics rendering contexts to enhance the real-time video coding for mobile cloud gaming

  • Authors:
  • Shu Shi;Cheng-Hsin Hsu;Klara Nahrstedt;Roy Campbell

  • Affiliations:
  • University of Illinois at Urbana-Champaign, Urbana, IL, USA;National Tsing Hua University, Hsinchu, Taiwan, ROC;University of Illinois at Urbana-Champaign, Urbana, IL, USA;University of Illinois at Urbana-Champaign, Urbana, IL, USA

  • Venue:
  • MM '11: Proceedings of the 19th ACM International Conference on Multimedia
  • Year:
  • 2011

Abstract

The emerging cloud gaming service has been growing rapidly but is not yet able to reach mobile users due to limitations such as bandwidth and latency. We introduce a 3D image warping-assisted real-time video coding method that can potentially meet all the requirements of mobile cloud gaming. The proposed video encoder selects a set of key frames in the video sequence, uses the 3D image warping algorithm to interpolate the remaining non-key frames, and encodes the key frames and the residual frames with an H.264/AVC encoder. Our approach is novel in taking advantage of the run-time graphics rendering contexts (rendering viewpoint, pixel depth, camera motion, etc.) from the 3D game engine to enhance the performance of video encoding for the cloud gaming service. The experiments indicate that our proposed video encoder has the potential to beat the state-of-the-art x264 encoder in the real-time cloud gaming scenario. For example, by implementing the proposed method in a 3D tank battle game, we experimentally show that a quality improvement of more than 2 dB is possible.
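The abstract describes a pipeline in which key frames are warped to the viewpoints of non-key frames using the engine's depth buffer and camera poses, so that only key frames and residuals need to go through the H.264/AVC encoder. Below is a minimal Python sketch of that structure, not the authors' implementation: all names (warp_frame, encode_sequence, K, pose) and the nearest-pixel splatting are illustrative assumptions, and the H.264 stage is only indicated in comments.

```python
# Minimal sketch of 3D image warping assisted encoding (assumed interfaces,
# not the paper's code). Key frames are coded directly; non-key frames are
# predicted by warping the last key frame with the rendering context, and only
# the residual is coded.
import numpy as np

def warp_frame(color, depth, K, pose_src, pose_dst):
    """Forward-warp `color` from the source view to the destination view.

    color : (H, W, 3) uint8  rendered key frame
    depth : (H, W)    float  per-pixel depth from the engine's Z-buffer
    K     : (3, 3)           camera intrinsics
    pose_*: (4, 4)           world-to-camera extrinsics of source/destination
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    # Back-project source pixels to 3D camera coordinates, then to world space.
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T   # 3 x N
    cam = (np.linalg.inv(K) @ pix) * depth.reshape(1, -1)               # 3 x N
    cam_h = np.vstack([cam, np.ones((1, cam.shape[1]))])                # 4 x N
    world = np.linalg.inv(pose_src) @ cam_h
    # Re-project into the destination view.
    dst = pose_dst @ world
    z = dst[2]
    proj = K @ dst[:3]
    x = np.round(proj[0] / np.maximum(z, 1e-6)).astype(int)
    y = np.round(proj[1] / np.maximum(z, 1e-6)).astype(int)
    warped = np.zeros_like(color)
    valid = (x >= 0) & (x < W) & (y >= 0) & (y < H) & (z > 0)
    # Nearest-pixel splatting; disoccluded holes stay black and end up in the residual.
    warped[y[valid], x[valid]] = color.reshape(-1, 3)[valid]
    return warped

def encode_sequence(frames, contexts, is_key):
    """Illustrative encode loop: key frames go to the H.264 encoder directly;
    non-key frames are predicted by warping and only the residual is coded."""
    last_key = None
    for frame, ctx, key in zip(frames, contexts, is_key):
        if key or last_key is None:
            last_key = (frame, ctx)
            yield ("key", frame)                       # feed to an H.264/AVC encoder
        else:
            ref, ref_ctx = last_key
            pred = warp_frame(ref, ref_ctx["depth"], ref_ctx["K"],
                              ref_ctx["pose"], ctx["pose"])
            residual = frame.astype(np.int16) - pred.astype(np.int16)
            yield ("residual", residual)               # feed to an H.264/AVC encoder
```

Because the forward warp leaves disocclusion holes, those regions appear in the residual, so residual coding after warping still lets the decoder reconstruct full-quality non-key frames while the warp itself absorbs most of the camera motion.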