Research on Quality of Experience (QoE) relies heavily on subjective evaluations of media. An important aspect of QoE concerns modeling and quantifying the subjective notions of 'beauty' (aesthetic appeal) and 'something well-known' (content recognizability), both of which are subject to cultural and social effects. Crowdsourcing, which allows employing people worldwide to perform short and simple tasks via online platforms, can be a great tool for performing subjective studies in a time- and cost-effective way. On the other hand, the crowdsourcing environment does not allow for the degree of experimental control that is necessary to guarantee reliable subjective data. To validate the use of crowdsourcing for QoE assessments, in this paper we evaluate the aesthetic appeal and recognizability of images using the Microworkers crowdsourcing platform and compare the outcomes with more conventional evaluations conducted in a controlled lab environment. We find high correlation between crowdsourcing and lab scores for recognizability but not for aesthetic appeal, indicating that crowdsourcing can be used for QoE subjective assessments as long as the workers' tasks are designed with extreme care to avoid misinterpretations.