Workers in microtask work environments such as Mechanical Turk typically do not know if or how they fit into a larger workflow. The research question we posed here was whether displaying information about the number of other workers doing the same task would lead to better or poorer work quality. In experiment 1, we varied the co-worker information presented to the worker and the number of his or her co-workers: "you" or "you alone" doing a task, or "you" plus 5, 15, or 50 co-workers. We compared these conditions with a no-social-information control. In experiment 2, we crossed the number of co-workers (5 vs. 50) with the type of incentive (individual vs. group). Results show that visually presenting co-workers changed workers' perceptions of them, and that the more co-workers participants perceived, the lower their work quality. We suggest future work to determine the kinds of co-worker information that increase or reduce work quality in microtask settings.