Our reliance on networked, collectively built information becomes a vulnerability when that information is of poor quality or reliability. Wikipedia, one such collectively built source, is often our first stop for information on all kinds of topics; its quality has stood up to many tests, and it prides itself on maintaining a "Neutral Point of View". Enforcement of neutrality, however, rests in the hands of comparatively few, powerful administrators. We find a surprisingly large number of editors who change their behavior once promoted to administrator status, focusing more heavily on a particular controversial topic. The conscious and unconscious biases of these few but powerful administrators may be shaping the information on many of Wikipedia's most sensitive topics; some may even be deliberately infiltrating the ranks of administrators in order to promote their own points of view. Neither an editor's prior history nor the vote counts in an administrator election can identify the editors most likely to change their behavior in this suspicious manner. We find that an alternative measure, one that gives more weight to influential voters, can successfully reject these suspicious candidates. This has important implications for how we harness collective intelligence: even if wisdom exists in a collective opinion (such as a vote), that signal can be lost unless we carefully distinguish the true expert voter from the noisy or manipulative voter.