A quest for an Internet video quality-of-experience metric

  • Authors:
  • Athula Balachandran (Carnegie Mellon University); Vyas Sekar (Stony Brook University); Aditya Akella (University of Wisconsin-Madison); Srinivasan Seshan (Carnegie Mellon University); Ion Stoica (University of California, Berkeley); Hui Zhang (Carnegie Mellon University)

  • Venue:
  • Proceedings of the 11th ACM Workshop on Hot Topics in Networks
  • Year:
  • 2012


Abstract

An imminent challenge that content providers, CDNs, third-party analytics and optimization services, and video player designers in the Internet video ecosystem face is the lack of a single "gold standard" to evaluate different competing solutions. Existing techniques that describe the quality of the encoded signal or controlled studies to measure opinion scores do not translate directly into user experience at scale. Recent work shows that measurable performance metrics such as buffering, startup time, bitrate, and number of bitrate switches impact user experience. However, converting these observations into a quantitative quality-of-experience metric turns out to be challenging since these metrics are interrelated in complex and sometimes counter-intuitive ways, and their relationship to user experience can be unpredictable. To further complicate things, many confounding factors are introduced by the nature of the content itself (e.g., user interest, genre). We believe that the issue of interdependency can be addressed by casting this as a machine learning problem to build a suitable predictive model from empirical observations. We also show that setting up the problem based on domain-specific and measurement-driven insights can minimize the impact of the various confounding factors to improve the prediction performance.
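To make the "cast it as a machine learning problem" idea concrete, here is a minimal, purely illustrative sketch (not the paper's actual model or data): it learns a one-level decision stump that predicts user engagement (fraction of the video watched) from a single measurable quality metric, a hypothetical buffering ratio, by choosing the split threshold that minimizes squared error. A real model would use multiple interrelated metrics (startup time, bitrate, switches) and control for confounders such as content genre.

```python
def fit_stump(sessions, metric):
    """Learn a one-level decision stump: pick the threshold on `metric`
    that minimizes squared error when engagement is predicted by the
    mean engagement on each side of the split."""
    best = None
    for t in sorted({s[metric] for s in sessions}):
        left = [s["engagement"] for s in sessions if s[metric] <= t]
        right = [s["engagement"] for s in sessions if s[metric] > t]
        if not left or not right:
            continue  # degenerate split, skip
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = sum((e - ml) ** 2 for e in left) + sum((e - mr) ** 2 for e in right)
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    _, t, ml, mr = best
    return lambda s: ml if s[metric] <= t else mr

# Hypothetical (made-up) sessions: buffering ratio, i.e. fraction of the
# session spent rebuffering, versus resulting engagement.
sessions = [
    {"buf_ratio": 0.00, "engagement": 0.95},
    {"buf_ratio": 0.01, "engagement": 0.90},
    {"buf_ratio": 0.05, "engagement": 0.60},
    {"buf_ratio": 0.10, "engagement": 0.30},
    {"buf_ratio": 0.20, "engagement": 0.10},
]

predict = fit_stump(sessions, "buf_ratio")
print(predict({"buf_ratio": 0.02}))  # high predicted engagement
print(predict({"buf_ratio": 0.15}))  # low predicted engagement
```

The stump captures the kind of non-linear, threshold-like relationship between quality metrics and engagement that makes a simple linear QoE formula inadequate; a full predictive model would stack many such splits into a decision tree.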