The computer science research paper review process is largely manual and time-intensive. More worrisome, review processes are frequently questioned and are often non-transparent. This work advocates applying computer science methods and tools to the computer science review process itself. As an initial exploration, we data-mine the submissions, bids, reviews, and decisions from a recent top-tier computer networking conference. We empirically test several common hypotheses, including the existence of readability, citation, call-for-papers adherence, and topical biases. From our findings, we propose review-process methods to improve fairness, efficiency, and transparency.
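
To make the hypothesis-testing step concrete, the following is a minimal sketch (not the authors' actual pipeline) of how one such bias test, readability bias, might look: score each submission with the standard Flesch reading-ease formula and compare accepted vs. rejected papers with a non-parametric two-sample test. The syllable counter, the function names, and the significance threshold are illustrative assumptions.

    # Sketch of a readability-bias test: do accepted and rejected
    # submissions differ significantly in Flesch reading-ease?
    import re
    from scipy.stats import mannwhitneyu  # non-parametric two-sample test

    def count_syllables(word: str) -> int:
        # Crude vowel-group count; adequate for a rough readability score.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text: str) -> float:
        # Flesch score: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        n_words = max(1, len(words))
        n_syll = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (n_syll / n_words)

    def readability_bias_test(accepted_texts, rejected_texts, alpha=0.05):
        # Returns the U statistic, the p-value, and whether the readability
        # distributions of the two groups differ at the chosen alpha level.
        acc = [flesch_reading_ease(t) for t in accepted_texts]
        rej = [flesch_reading_ease(t) for t in rejected_texts]
        stat, p = mannwhitneyu(acc, rej, alternative="two-sided")
        return stat, p, p < alpha

The other hypothesized biases fit the same template: replace the readability score with a per-paper feature (citation count, a call-for-papers keyword-overlap score, or a topic label) and rerun the comparison between accepted and rejected groups.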