As the community depends more heavily on Wikipedia as a source of reliable information, the ability to quickly detect and remove detrimental information becomes increasingly important. The longer incorrect or malicious information lingers in a source perceived as reputable, the more likely that information is to be accepted as correct, and the greater the damage to the source's reputation. We present The Illiterate Editor (IllEdit), a content-agnostic, metadata-driven classification approach to Wikipedia revert detection. Our primary contribution is a metadata-based feature set for assessing edit quality, which is fed into a Support Vector Machine for edit classification. By analyzing edit histories, the IllEdit system builds a profile of user behavior, estimates expertise and spheres of knowledge, and determines whether a given edit is likely to be eventually reverted. The system's success in revert detection (0.844 F-measure), together with a feature set disjoint from those of existing content-analyzing vandalism detectors, shows promise for using IllEdit synergistically with such systems to increase the reliability of community information.
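To make the pipeline concrete, the sketch below shows what a content-agnostic, metadata-driven feature extractor of this kind might look like. The `Edit` record and every feature (prior edit count, historical revert rate, edits on the same page as a rough "sphere of knowledge" signal, and edit size) are illustrative assumptions, not the paper's actual feature set.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical edit-history record; field names are illustrative,
# not taken from the IllEdit paper.
@dataclass
class Edit:
    user: str
    page: str
    size_delta: int   # bytes added (positive) or removed (negative)
    reverted: bool    # whether this edit was eventually reverted

def metadata_features(history: List[Edit], edit: Edit) -> List[float]:
    """Build a feature vector for one candidate edit using only the
    editing user's prior history -- no article text is examined."""
    prior = [e for e in history if e.user == edit.user]
    n = len(prior)
    # Fraction of the user's past edits that were reverted;
    # 0.5 is an assumed neutral prior for unseen users.
    revert_rate = sum(e.reverted for e in prior) / n if n else 0.5
    # Prior edits on the same page: a crude "sphere of knowledge" proxy.
    on_page = sum(e.page == edit.page for e in prior)
    return [float(n), revert_rate, float(on_page), float(abs(edit.size_delta))]
```

In a full system, vectors like these would be computed for many labeled edits and passed to an SVM trainer (e.g. scikit-learn's `SVC`) to learn the revert/keep decision boundary; that training step is omitted here.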