Collaborative view synthesis for interactive multi-view video streaming

  • Authors:
  • Fei Chen, Jiangchuan Liu, Edith Cheuk-Han Ngai, Yuan Zhao

  • Affiliations:
  • Simon Fraser University, Burnaby, BC, Canada; Simon Fraser University, Burnaby, BC, Canada; Uppsala University, Uppsala, Sweden; Simon Fraser University, Burnaby, BC, Canada

  • Venue:
  • Proceedings of the 22nd International Workshop on Network and Operating System Support for Digital Audio and Video
  • Year:
  • 2012

Abstract

Interactive multi-view video enables users to enjoy the video from different viewpoints. Yet multi-view content dramatically increases the video data volume and the associated computation, making real-time transmission and interaction a challenging task. It therefore calls for efficient view synthesis strategies that flexibly generate virtual views. In this paper, we present a collaborative view synthesis strategy for online interactive multi-view video streaming based on Depth-Image-Based Rendering (DIBR), which generates a virtual view from the texture and depth information of the reference views on both sides. Different from the traditional DIBR algorithm for single-view synthesis, we explore the collaborative relationship among the syntheses of different viewpoints when generating a range of virtual views, and propose Shift DIBR (S-DIBR). In S-DIBR, only the projected pixels, rather than all the pixels of the reference view, are utilized to generate the next virtual view. The computational complexity of the projection transform, the most computation-intensive step in the traditional DIBR algorithm, is thus reduced to meet the requirement of online interactive streaming. Experimental results validate the efficiency of our collaborative view synthesis strategy, as well as the bandwidth scalability of the streaming system.
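
The abstract describes DIBR warping and the S-DIBR reuse of already projected pixels only at a high level. The Python sketch below illustrates the general idea under simplifying assumptions: rectified cameras with a purely horizontal baseline, so the projection transform reduces to a per-pixel disparity shift d = f * B / Z. The function names, the forward-splatting scheme, and the hole handling are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of DIBR warping and the S-DIBR "shift" reuse of projected
# pixels, assuming rectified cameras with a horizontal baseline so that the
# projection reduces to a disparity shift d = f * B / Z.  All names and the
# splatting details are illustrative; this is not the paper's implementation.
import numpy as np

def dibr_warp(texture, depth, focal, baseline):
    """Project a reference view to a virtual view a given baseline away.

    texture: (H, W, 3) colour image; depth: (H, W) metric depth (Z > 0, or
    np.inf for empty/hole pixels).  Returns the warped texture and the depth
    map as seen from the virtual viewpoint.
    """
    H, W = depth.shape
    disparity = focal * baseline / depth                    # horizontal shift, in pixels
    new_cols = np.round(np.arange(W) - disparity).astype(int)

    warped = np.zeros_like(texture)
    warped_depth = np.full((H, W), np.inf)

    # Z-buffered forward splatting: process pixels far-to-near so that nearer
    # pixels, written later, overwrite farther ones at the same target column.
    order = np.argsort(-depth, axis=None)
    r, c = np.unravel_index(order, depth.shape)
    c_new = new_cols[r, c]
    keep = (c_new >= 0) & (c_new < W)
    r, c, c_new = r[keep], c[keep], c_new[keep]
    warped[r, c_new] = texture[r, c]
    warped_depth[r, c_new] = depth[r, c]
    return warped, warped_depth

def s_dibr_shift(warped, warped_depth, focal, delta_baseline):
    """Generate the next virtual view by shifting only the projected pixels.

    Instead of re-projecting every pixel of the reference view, the pixels
    that survived the previous projection are shifted by the incremental
    disparity f * dB / Z, which is where the computational saving of the
    collaborative (S-DIBR-style) synthesis comes from.
    """
    holes = ~np.isfinite(warped_depth)
    texture = np.where(holes[..., None], 0, warped)
    depth = np.where(holes, np.inf, warped_depth)           # infinite depth => zero shift
    # Reuse the same warping kernel, but only on the already projected pixels.
    return dibr_warp(texture, depth, focal, delta_baseline)
```

As a usage sketch, a server could call dibr_warp once to synthesize the first virtual view from a reference texture/depth pair and then call s_dibr_shift repeatedly with small baseline increments to sweep across a range of adjacent virtual viewpoints, filling any disocclusion holes with the second reference view or an inpainting step.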