Lab experiment vs. crowdsourcing: a comparative user study on Skype call quality

  • Authors:
  • Yu-Chuan Yen; Cing-Yu Chu; Su-Ling Yeh; Hao-Hua Chu; Polly Huang

  • Affiliations:
  • National Taiwan University; New York University - Polytechnics; National Taiwan University; National Taiwan University; National Taiwan University

  • Venue:
  • Proceedings of the 9th Asian Internet Engineering Conference
  • Year:
  • 2013

Abstract

To deliver voice over the Internet cost-effectively, it is essential to quantify the quality of user experience (QoE) of a voice service at various provisioning levels. User studies are an indispensable step toward such quantitative analysis of QoE. This study compares two experimental methods: a lab experiment and crowdsourcing via Amazon Mechanical Turk [1]. We find that, for the study of Skype call quality, the crowdsourcing approach stands out in efficiency and user diversity, which in turn strengthens the robustness and depth of the analysis.