Social translucence: an approach to designing systems that support social processes
ACM Transactions on Computer-Human Interaction (TOCHI) - Special issue on human-computer interaction in the new millennium, Part 1
Social translucence: designing social infrastructures that make collective activity visible
Communications of the ACM - Supporting community and building social capital
Identity disclosure and the creation of social capital
CHI '03 Extended Abstracts on Human Factors in Computing Systems
Labeling images with a computer game
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Lifting the veil: improving accountability and social transparency in Wikipedia with WikiDashboard
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Designing games with a purpose
Communications of the ACM - Designing games with a purpose
Can you ever trust a wiki?: impacting perceived trustworthiness in Wikipedia
Proceedings of the 2008 ACM conference on Computer supported cooperative work
Crowdsourcing for relevance evaluation
ACM SIGIR Forum
CHI '09 Extended Abstracts on Human Factors in Computing Systems
Cheap and fast---but is it good?: evaluating non-expert annotations for natural language tasks
EMNLP '08 Proceedings of the Conference on Empirical Methods in Natural Language Processing
Financial incentives and the "performance of crowds"
ACM SIGKDD Explorations Newsletter
Quality management on Amazon Mechanical Turk
Proceedings of the ACM SIGKDD Workshop on Human Computation
Human computation: a survey and taxonomy of a growing field
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
CrowdForge: crowdsourcing complex work
Proceedings of the 24th annual ACM symposium on User interface software and technology
Human Computation
Social transparency in networked information exchange: a theoretical framework
Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work
Shepherding the crowd yields better work
Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work
Enhancing reliability using peer consistency evaluation in human computation
Proceedings of the 2013 conference on Computer supported cooperative work
A comparison of social, learning, and financial strategies on crowd engagement and output quality
Proceedings of the 17th ACM conference on Computer supported cooperative work & social computing
This paper studied how social transparency and different peer-dependent reward schemes (i.e., individual, teamwork, and competition) affect the outcomes of crowdsourcing. The results showed that when social transparency was increased by asking otherwise anonymous workers to share demographic information (e.g., name, nationality) with their paired worker, they performed significantly better. A more detailed analysis showed that in the teamwork reward scheme, in which the reward of the paired workers depended only on their collective outcome, increasing social transparency could offset the effects of social loafing by making workers more accountable to their teammates. In the competition reward scheme, in which workers competed against each other and the reward depended on how much they outperformed their opponent, increasing social transparency could augment the effects of social facilitation by giving workers a stronger incentive to outperform their opponent. The results suggested that carefully combining methods that increase social transparency with appropriate reward schemes can significantly improve crowdsourcing outcomes.
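The three peer-dependent reward schemes described above can be sketched as simple payoff functions. This is a minimal illustrative sketch, not the paper's actual payment rules: the `rate` parameter and the specific formulas (averaging for teamwork, a non-negative margin for competition) are assumptions chosen only to make the structural difference between the schemes concrete.

```python
def individual_reward(own_score, rate=0.01):
    # Individual scheme: pay depends only on the worker's own output.
    return own_score * rate

def teamwork_reward(own_score, partner_score, rate=0.01):
    # Teamwork scheme: both paired workers are paid from their collective
    # outcome, so each worker's own contribution is diluted -- the
    # condition under which social loafing can arise.
    return (own_score + partner_score) / 2 * rate

def competition_reward(own_score, partner_score, rate=0.01):
    # Competition scheme: pay depends on how much a worker outperforms
    # the paired opponent; falling behind earns no bonus.
    return max(own_score - partner_score, 0) * rate
```

Under this sketch, a worker who slacks off in the teamwork scheme still collects half of the partner's output, whereas in the competition scheme only the margin over the opponent pays, which is consistent with the loafing and facilitation effects the paper discusses.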