Assessing internet video quality using crowdsourcing

  • Authors:
  • Óscar Figuerola Salas (Florida Atlantic University, Boca Raton, FL, USA)
  • Velibor Adzic (Florida Atlantic University, Boca Raton, FL, USA)
  • Akash Shah (Nirma University Institute of Technology, Ahmedabad, India)
  • Hari Kalva (Florida Atlantic University, Boca Raton, FL, USA)

  • Venue:
  • Proceedings of the 2nd ACM international workshop on Crowdsourcing for multimedia
  • Year:
  • 2013

Abstract

In this paper, we present a subjective video quality evaluation system that has been integrated with different crowdsourcing platforms. We evaluate the feasibility of replacing time-consuming and expensive traditional tests with a faster, less expensive crowdsourcing alternative. CrowdFlower and Amazon's Mechanical Turk were used as the crowdsourcing platforms to collect data. The data were compared with the formal subjective tests conducted by MPEG as part of the video standardization process, as well as with results from a previous study we ran at the university level. High-quality compressed videos with known Mean Opinion Score (MOS) were used as references instead of the original lossless videos in order to overcome intrinsic bandwidth limitations. The bitrates for the experiment were selected to target Internet use, since this is the environment in which users would evaluate the videos. Evaluations showed that the results are consistent with formal subjective evaluation scores and can be reproduced across different crowds with low variability, which makes this type of test setting very promising.
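
To make the comparison described in the abstract concrete, here is a minimal sketch (not the authors' code) of how per-video Mean Opinion Scores could be aggregated from crowdsourced ratings and correlated with formal subjective scores; all video names, rating values, and formal MOS figures below are hypothetical placeholders.

```python
# Sketch of MOS aggregation and crowd-vs-formal comparison.
# Data layout and values are hypothetical, for illustration only.
from statistics import mean, stdev

# Hypothetical crowd ratings: video id -> list of 1-5 opinion scores.
crowd_ratings = {
    "video_a": [4, 5, 4, 3, 4],
    "video_b": [2, 3, 2, 2, 3],
    "video_c": [5, 4, 5, 5, 4],
}

# Hypothetical formal MOS values for the same clips
# (e.g., from MPEG subjective tests).
formal_mos = {"video_a": 4.1, "video_b": 2.3, "video_c": 4.7}

def mos(scores):
    """Mean Opinion Score: the arithmetic mean of individual ratings."""
    return mean(scores)

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

videos = sorted(crowd_ratings)
crowd = [mos(crowd_ratings[v]) for v in videos]
formal = [formal_mos[v] for v in videos]

# Correlation indicates how well crowd MOS tracks formal MOS;
# per-video spread indicates variability across raters.
print("Crowd vs. formal MOS correlation:", round(pearson(crowd, formal), 3))
for v in videos:
    print(v, "crowd MOS:", round(mos(crowd_ratings[v]), 2),
          "spread:", round(stdev(crowd_ratings[v]), 2))
```

A high correlation between the two MOS columns, together with a low per-video spread, would correspond to the paper's finding that crowdsourced scores are consistent with formal scores and reproducible across different crowds.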