Optimized scalable cache management for video streaming system

  • Authors:
  • Yongfeng Li; Kenneth Ong

  • Affiliations:
  • Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117576

  • Venue:
  • Multimedia Tools and Applications

  • Year:
  • 2009

Abstract

An important research issue in video streaming is how to efficiently utilize network resources to provide clients with instant access to multiple video objects. The caching strategy and the transmission scheme are the two essential components of a video streaming framework. Recent research efforts on them are insufficient because they offer only inflexible support for scalably encoded video streams and heterogeneous client requests. In this paper, we propose an optimized caching strategy (OCS) and a scalable transmission scheme (STS) for scalably coded video streaming. By exploiting the characteristics of video streaming workloads and the system design objectives, OCS and STS work together to minimize both network bandwidth cost and user access latency. First, we analyze the caching problem for a proxy-assisted video streaming system and derive a manageable caching scenario. Second, we develop an efficient transmission scheme for scalably coded videos. Third, we formulate a multi-objective optimization model with closed-form expressions to obtain the optimized caching strategy. Finally, with the designed algorithms, an excellent compromise between the two competing objectives (minimizing the bandwidth cost and the access latency) is achieved. We begin our evaluation by studying the optimized caching strategy for a single video object; we then apply the strategy to multiple video objects and illustrate the tradeoff between the optimization objectives. Our evaluation results show that, compared with other caching strategies, the proposed optimized scalable caching strategy achieves a significant reduction in bandwidth cost even with a small proxy cache size. Moreover, the best performance in terms of bandwidth cost is obtained when the strategy is combined with the proposed scalable batch-patching transmission scheme.
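
The tradeoff described above, spending proxy cache space to cut server bandwidth while still keeping startup latency low across a heterogeneous catalog, can be illustrated with a small weighted-sum sketch. The code below is a hypothetical Python illustration under simple assumed cost and latency models, with a greedy layer-by-layer cache allocation; it is not the closed-form multi-objective model or the algorithms developed in the paper, and names such as `allocate_cache` and the numeric parameters are assumptions made for the example.

```python
# Hypothetical sketch: dividing a shared proxy cache among scalably coded
# videos by a weighted sum of assumed bandwidth and latency costs.
# This is an illustration only, not the paper's optimization model.

from dataclasses import dataclass
from typing import List


@dataclass
class Video:
    name: str
    request_rate: float     # requests per minute (popularity)
    num_layers: int         # base layer plus enhancement layers
    layer_size_mb: float    # cache space needed per layer
    layer_rate_mbps: float  # streaming rate contributed by each layer


def objective(videos: List[Video], cached: List[int], lam: float) -> float:
    """Weighted sum of assumed bandwidth and latency costs.

    lam in [0, 1]: larger values emphasize bandwidth, smaller values latency.
    """
    # Assumed bandwidth model: layers missing from the proxy are streamed
    # from the origin server for every request.
    bandwidth = sum(v.request_rate * (v.num_layers - c) * v.layer_rate_mbps
                    for v, c in zip(videos, cached))
    # Assumed latency model: a request pays a startup penalty only when the
    # base (prefix) layer of that video is not cached at the proxy.
    latency = sum(v.request_rate for v, c in zip(videos, cached) if c == 0)
    return lam * bandwidth + (1.0 - lam) * latency


def allocate_cache(videos: List[Video], budget_mb: float, lam: float) -> List[int]:
    """Greedily grant cache space one layer at a time to whichever video
    reduces the weighted objective the most per megabyte."""
    cached = [0] * len(videos)
    while True:
        current = objective(videos, cached, lam)
        best_gain, best_i = 0.0, None
        for i, v in enumerate(videos):
            if cached[i] < v.num_layers and v.layer_size_mb <= budget_mb:
                trial = cached.copy()
                trial[i] += 1
                gain = (current - objective(videos, trial, lam)) / v.layer_size_mb
                if gain > best_gain:
                    best_gain, best_i = gain, i
        if best_i is None:
            return cached
        cached[best_i] += 1
        budget_mb -= videos[best_i].layer_size_mb


if __name__ == "__main__":
    catalog = [
        Video("popular", 30.0, 4, 50.0, 0.5),
        Video("average", 10.0, 4, 50.0, 0.5),
        Video("niche", 2.0, 4, 50.0, 0.5),
    ]
    for lam in (0.9, 0.5, 0.1):
        plan = allocate_cache(catalog, budget_mb=300.0, lam=lam)
        print(f"lambda={lam}: cached layers per video = {plan}")
```

Sweeping the weight from 1 toward 0 in this sketch shifts cache space from many layers of the most popular video toward at least the base layer of every video, which mirrors the bandwidth-versus-latency compromise the abstract describes.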