Repeatability and workability evaluation of SIGMOD 2011

  • Authors:
  • Philippe Bonnet;Stefan Manegold;Matias Bjørling;Wei Cao;Javier Gonzalez;Joel Granados;Nancy Hall;Stratos Idreos;Milena Ivanova;Ryan Johnson;David Koop;Tim Kraska;René Müller;Dan Olteanu;Paolo Papotti;Christine Reilly;Dimitris Tsirogiannis;Cong Yu;Juliana Freire;Dennis Shasha

  • Affiliations:
  • ITU, Denmark;CWI, Netherlands;ITU, Denmark;Renmin University, China;ITU, Denmark;ITU, Denmark;University of Wisconsin, USA;CWI, Netherlands;CWI, Netherlands;University of Toronto, Canada;University of Utah, USA;UC Berkeley, USA;IBM Almaden, USA;Oxford University, UK;Università Roma Tre, Italy;University of Texas-Pan American, USA;Microsoft, USA;Google, USA;University of Utah, USA;New York University, USA

  • Venue:
  • ACM SIGMOD Record
  • Year:
  • 2011

Abstract

Since 2008, SIGMOD has offered to verify the experiments published in the papers accepted at the conference. This year, we were in charge of reproducing the experiments provided by the authors (repeatability) and of exploring changes to experiment parameters (workability). In this paper, we assess the SIGMOD repeatability process in terms of participation, review process, and results. While participation is stable in terms of the number of submissions, this year we find a sharp contrast between the high participation of Asian authors and the low participation of American authors. We also find that most experiments are distributed as Linux packages accompanied by instructions on how to set up and run the experiments. We are still far from the vision of executable papers.