A generic quantitative relationship between quality of experience and quality of service. IEEE Network: The Magazine of Global Internetworking, special issue on improving quality of experience for network services.
Quality management on Amazon Mechanical Turk. Proceedings of the ACM SIGKDD Workshop on Human Computation.
Measuring the perceptual quality of Skype sources. ACM SIGCOMM Computer Communication Review, special October issue, SIGCOMM '12.
Modeling the QoE of rate changes in Skype/SILK VoIP calls. Proceedings of the 20th ACM International Conference on Multimedia.
Developing a predictive model of quality of experience for Internet video. Proceedings of the ACM SIGCOMM 2013 Conference.
To deliver voice over the Internet cost-effectively, it is essential to quantify the quality of experience (QoE) of a voice service at various provisioning levels. Conducting user studies is an indispensable step toward quantitative QoE analysis. This study compares two experimental methods: laboratory experiments and crowdsourcing via Amazon Mechanical Turk [1]. We find that, for the study of Skype call quality, the crowdsourcing approach stands out in terms of efficiency and user diversity, which in turn strengthens the robustness and depth of the analysis.
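A practical concern with crowdsourced QoE ratings, as opposed to supervised lab sessions, is filtering out careless or adversarial workers before aggregating scores into a Mean Opinion Score (MOS) per test condition. The sketch below illustrates one common safeguard under stated assumptions: drop any rater whose scores correlate poorly with the leave-one-out consensus of the remaining raters. The data layout, function names, and the 0.5 correlation threshold are illustrative assumptions, not details from the paper.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation of two equal-length score sequences."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

def filter_and_mos(ratings, min_corr=0.5):
    """ratings: {rater: {condition: score}}, all raters sharing one
    condition set. Returns (per-condition MOS, retained raters)."""
    conditions = sorted(next(iter(ratings.values())))
    kept = {}
    for rater, scores in ratings.items():
        # Consensus excludes the rater under test to avoid
        # self-correlation inflating the agreement score.
        others = [r for r in ratings if r != rater]
        consensus = [mean(ratings[o][c] for o in others)
                     for c in conditions]
        own = [scores[c] for c in conditions]
        if pearson(own, consensus) >= min_corr:
            kept[rater] = scores
    mos = {c: mean(kept[r][c] for r in kept) for c in conditions}
    return mos, kept
```

For example, a rater who scores conditions in the opposite order from everyone else yields a negative consensus correlation and is excluded, while honest raters with ordinary score offsets are retained.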