Creating and maintaining semantic structures such as ontologies on a large scale is a labor-intensive task that no single individual can perform, and established automated solutions for it do not yet exist. Peer production is a promising approach to creating structured knowledge: members of an online community create and maintain semantic structures collaboratively. To motivate members to participate and to ensure the quality of the data, rating-based incentive mechanisms can be used: members mutually rate the quality of contributions and are rewarded for good contributions and truthful ratings. Until now, such rating mechanisms have not been systematically evaluated in the context of structured knowledge. We have developed a platform for the collaborative creation of semantic structures, and we have conducted an extensive empirical study in an online community to evaluate the effect of ratings and incentive mechanisms on the quality of peer-produced data. By comparing user ratings with an ex post evaluation by experts, we show that ratings are a reliable measure of the quality of contributions. Our experiments further show that incentive mechanisms increase the quality of contributions. We conclude that ratings and incentive mechanisms are promising means to foster and improve the peer production of structured knowledge.
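To make the described mechanism concrete, the following is a minimal sketch of a rating-based incentive scheme in which contributors are rewarded when their contribution's mean peer rating clears a quality threshold, and raters are rewarded when their rating agrees with the consensus of the other raters (a simple proxy for truthful-rating incentives). The abstract does not specify the paper's actual reward formulas, so the function names, thresholds, and reward values below are illustrative assumptions.

```python
from statistics import mean

# Illustrative sketch only: the thresholds and reward values are
# assumptions, not the mechanism actually evaluated in the study.

def contribution_reward(ratings, threshold=0.6, bonus=10):
    """Reward a contribution whose mean peer rating (scale 0..1)
    reaches the quality threshold."""
    if not ratings:
        return 0
    return bonus if mean(ratings) >= threshold else 0

def rating_reward(my_rating, other_ratings, tolerance=0.2, bonus=1):
    """Reward a rater whose rating lies close to the consensus
    (mean) of the other raters -- a crude agreement-based proxy
    for rewarding truthful ratings."""
    if not other_ratings:
        return 0
    consensus = mean(other_ratings)
    return bonus if abs(my_rating - consensus) <= tolerance else 0
```

For example, a contribution rated 0.8, 0.7, and 0.9 would earn its author the contribution bonus, and a rater who gives 0.7 when the others average 0.7 would earn the rating bonus.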