Crowdsourcing strategies are increasingly applied in the area of Quality of Experience (QoE) assessment for multimedia. They enable researchers to conduct experiments with a more diverse set of participants, and at lower cost, than conventional laboratory studies. A major challenge for crowdsourcing tests, however, is the detection and control of outliers, which may arise from varying test conditions, human error, or abnormal contextual variations. This calls for a robust evaluation methodology that can handle crowdsourced data, which are often incomplete, imbalanced, and distributed over a graph. In this paper, we propose a robust rating scheme, based on robust regression and Hodge decomposition on graphs, to assess QoE via crowdsourcing. The scheme shows that removing outliers from crowdsourcing experiments helps purify the data and yields more reliable results. The effectiveness of the proposed scheme is further confirmed by experimental studies on both simulated examples and real-world data.
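As an illustrative sketch only (not the authors' implementation), the idea can be mimicked with plain least squares: the gradient component of the Hodge decomposition on a pairwise-comparison graph yields global quality scores, and outlying comparisons can be flagged by iteratively discarding edges with large residuals. The function names and the MAD-based residual threshold below are assumptions chosen for the example.

```python
import numpy as np

def hodgerank_scores(edges, y, n):
    """Least-squares global scores from pairwise comparisons.

    edges: list of (i, j) item pairs; y[k] is the observed preference
    of item j over item i on edge k.  Solves
        min_s  sum_k (s[j_k] - s[i_k] - y[k])^2,
    i.e. projects the edge flow y onto gradient flows -- the gradient
    component of the Hodge decomposition on the comparison graph.
    """
    X = np.zeros((len(edges), n))
    for k, (i, j) in enumerate(edges):
        X[k, i], X[k, j] = -1.0, 1.0
    s, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return s - s.mean()  # scores are only defined up to a constant

def robust_scores(edges, y, n, thresh=3.0, iters=5):
    """Simple robust-regression stand-in for outlier control:
    iteratively refit, then drop edges whose residual exceeds
    `thresh` robust standard deviations (MAD-based scale)."""
    keep = np.ones(len(edges), dtype=bool)
    for _ in range(iters):
        s = hodgerank_scores([e for e, k in zip(edges, keep) if k],
                             [v for v, k in zip(y, keep) if k], n)
        r = np.array([yk - (s[j] - s[i]) for (i, j), yk in zip(edges, y)])
        mad = np.median(np.abs(r - np.median(r))) + 1e-12
        keep = np.abs(r) <= thresh * 1.4826 * mad
    return s, ~keep  # scores, plus a mask flagging outlier edges
```

With repeated observations per pair, a single grossly corrupted comparison produces a residual far above the robust scale estimate, so it is flagged and excluded, after which the refit recovers the clean scores.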