Crowdsourcing is emerging as an effective method for performing tasks that require human abilities, such as tagging photos, transcribing handwriting, and categorising data. Crowd workers complete small chunks of larger tasks in return for a reward, which is usually monetary. Reward is one factor that can motivate workers to produce higher-quality results. Yet, as previous research has highlighted, the design of a task, in terms of its instructions and user interface, also shapes how workers perceive the task and, in turn, the quality of their results. In this study we investigate both factors, reward and task design, to better understand their roles in determining the quality of crowdsourced work. In Experiment 1 we test a variety of reward schemas, while in Experiment 2 we measure how task and interface complexity affect workers' attention. The long-term goal is to establish guidelines for designing tasks that maximize workers' performance.