Buffering and caching in large-scale video servers

  • Authors:
  • A. Dan; D. M. Dias; R. Mukherjee; D. Sitaram; R. Tewari


  • Venue:
  • COMPCON '95 Proceedings of the 40th IEEE Computer Society International Conference
  • Year:
  • 1995

Abstract

Video-on-demand servers are characterized by stringent real-time constraints, as each stream requires isochronous data playout. The capacity of the system depends on the acceptable jitter per stream (the number of data blocks that do not meet their real-time constraints). Per-stream read-ahead buffering avoids the disruption in playback caused by variations in disk access time and queuing delays. With heavily skewed access patterns to the stored video data, the system is often disk-arm bound. In such cases, serving video streams from a memory cache can substantially reduce server cost. In this paper, we study the cost-performance trade-offs of various buffering and caching strategies that can be used in a large-scale video server. We first study how varying the buffer size, disk utilization, and disk characteristics affects the cost and overall capacity of the system. Subsequently, we study the cost-effectiveness of a technique for memory caching across streams that exploits temporal locality and workload fluctuations.
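
The cross-stream caching idea mentioned at the end of the abstract can be illustrated with a small sketch. The Python model below is a simplified assumption, not the authors' algorithm or results: it assumes a shared LRU cache of fixed-size video blocks, so that a stream which starts shortly after another stream of the same video finds the blocks the leading stream has just read still in memory and avoids most disk reads. The class name CrossStreamCache, the eviction policy, and the capacity and offset values are hypothetical, chosen only to show the temporal-locality effect.

    # Illustrative sketch: caching blocks across streams of the same video
    # so that a closely following stream is served from memory, not disk.
    from collections import OrderedDict

    class CrossStreamCache:
        """Retains the most recently played blocks of each video in memory."""

        def __init__(self, capacity_blocks):
            self.capacity = capacity_blocks     # total memory budget, in blocks
            self.blocks = OrderedDict()         # (video_id, block_no) -> data, LRU order
            self.disk_reads = 0
            self.cache_hits = 0

        def read_block(self, video_id, block_no):
            key = (video_id, block_no)
            if key in self.blocks:
                # A leading stream already brought this block into memory.
                self.cache_hits += 1
                self.blocks.move_to_end(key)
                return self.blocks[key]
            # Otherwise the block must come from disk (simulated here).
            self.disk_reads += 1
            data = f"<block {block_no} of video {video_id}>"
            self.blocks[key] = data
            if len(self.blocks) > self.capacity:
                self.blocks.popitem(last=False)  # evict the least recently used block
            return data

    if __name__ == "__main__":
        cache = CrossStreamCache(capacity_blocks=50)
        # Stream A starts at t = 0; stream B starts on the same video 20 blocks later.
        for t in range(200):
            cache.read_block("movie-1", t)              # leading stream
            if t >= 20:
                cache.read_block("movie-1", t - 20)     # following stream, 20 blocks behind
        print("disk reads:", cache.disk_reads, "cache hits:", cache.cache_hits)

In this toy run the cache capacity (50 blocks) exceeds the gap between the two streams (20 blocks), so nearly every read issued by the following stream is a cache hit and only the leading stream touches the disk. When many streams of a popular video arrive close together, this is the kind of disk-arm relief the abstract attributes to memory caching across streams.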