Robust evaluation for quality of experience in crowdsourcing

  • Authors:
  • Qianqian Xu; Jiechao Xiong; Qingming Huang; Yuan Yao

  • Affiliations:
  • BICMR, Peking University & University of Chinese Academy of Sciences, Beijing, China; School of Mathematical Sciences & BICMR, Peking University, Beijing, China; University of Chinese Academy of Sciences & Institute of Computing Technology of Chinese Academy of Sciences, Beijing, China; School of Mathematical Sciences, LMAM-LMEQF-LMP, Peking University, Beijing, China

  • Venue:
  • Proceedings of the 21st ACM international conference on Multimedia
  • Year:
  • 2013


Abstract

Strategies exploiting crowdsourcing are increasingly being applied in the area of Quality of Experience (QoE) for multimedia. They enable researchers to conduct experiments with a more diverse set of participants and at a lower economic cost than conventional laboratory studies. However, a major challenge for crowdsourcing tests is the detection and control of outliers, which may arise from differing test conditions, human errors, or abnormal variations in context. This motivates a robust evaluation methodology for crowdsourced data, which are possibly incomplete, imbalanced, and distributed on a graph. In this paper, we propose a robust rating scheme, based on robust regression and Hodge decomposition on graphs, to assess QoE using crowdsourcing. The scheme demonstrates that removing outliers from crowdsourcing experiments helps purify the data and yields more reliable results. The effectiveness of the proposed scheme is further confirmed by experimental studies on both simulated examples and real-world data.
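
The following is a minimal sketch, not the authors' implementation: it recovers a global score from pairwise comparison data on a graph by least squares (the gradient component of a Hodge decomposition) and contrasts it with a robust variant that uses a Huber loss to down-weight outlier comparisons. The function names, the Huber choice of robust loss, and the toy data are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np
from scipy.optimize import least_squares

def pairwise_ls_scores(n_items, comparisons):
    """Ordinary least-squares (HodgeRank-style) global scores.

    comparisons: list of (i, j, y) meaning item i is preferred to item j by margin y.
    Returns a mean-zero score vector s with s[i] - s[j] fit to y.
    """
    rows, rhs = [], []
    for i, j, y in comparisons:
        d = np.zeros(n_items)
        d[i], d[j] = 1.0, -1.0
        rows.append(d)
        rhs.append(y)
    A, b = np.vstack(rows), np.array(rhs)
    s, *_ = np.linalg.lstsq(A, b, rcond=None)
    return s - s.mean()

def robust_scores(n_items, comparisons, delta=1.0):
    """Robust variant: a Huber loss reduces the influence of outlier margins."""
    def residuals(s):
        return np.array([s[i] - s[j] - y for i, j, y in comparisons])
    sol = least_squares(residuals, np.zeros(n_items), loss="huber", f_scale=delta)
    return sol.x - sol.x.mean()

if __name__ == "__main__":
    # Toy experiment: 5 items with known scores, ~10% sign-flipped (outlier) votes.
    rng = np.random.default_rng(0)
    true_s = np.array([2.0, 1.0, 0.0, -1.0, -2.0])
    comps = []
    for _ in range(200):
        i, j = rng.choice(5, size=2, replace=False)
        y = true_s[i] - true_s[j] + 0.1 * rng.standard_normal()
        if rng.random() < 0.1:
            y = -y  # simulated outlier rating
        comps.append((i, j, y))
    print("LS     :", np.round(pairwise_ls_scores(5, comps), 2))
    print("Robust :", np.round(robust_scores(5, comps), 2))
```

On such simulated data the robust fit typically recovers the underlying ordering more faithfully than plain least squares, which illustrates the paper's point that controlling outliers in crowdsourced comparisons leads to more reliable QoE scores.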