When the levee breaks: without bots, what happens to Wikipedia's quality control processes?

  • Authors:
  • R. Stuart Geiger; Aaron Halfaker

  • Affiliations:
  • University of California, Berkeley, CA; University of Minnesota, Minneapolis, Minnesota

  • Venue:
  • Proceedings of the 9th International Symposium on Open Collaboration
  • Year:
  • 2013

Abstract

In the first half of 2011, ClueBot NG -- one of the most prolific counter-vandalism bots in the English-language Wikipedia -- went down for four distinct periods, each lasting from days to weeks. In this paper, we use these periods of breakdown as naturalistic experiments to study Wikipedia's heterogeneous quality control network, which we analyze as a multi-tiered system in which distinct classes of reviewers use various reviewing technologies to patrol for different kinds of damage at staggered time periods. Our analysis shows that the overall time to revert edits nearly doubled when this software agent was down. Yet while a significantly smaller proportion of edits made during the bot's downtime were reverted at the time, those edits were eventually reverted later. This suggests that other agents in Wikipedia took over this quality control work, but performed it at a far slower rate.