Webs of Archived Distributed Computations for Asynchronous Collaboration
The Journal of Supercomputing - Special issue: high performance distributed computing
The repeatability experiment of SIGMOD 2008
ACM SIGMOD Record
Guest Editors' Introduction: Reproducible Research
Computing in Science and Engineering
Reproducible Research in Computational Harmonic Analysis
Computing in Science and Engineering
The Legal Framework for Reproducible Scientific Research: Licensing and Copyright
Computing in Science and Engineering
VisMashup: Streamlining the Creation of Custom Visualization Applications
IEEE Transactions on Visualization and Computer Graphics
Repeatability & workability evaluation of SIGMOD 2009
ACM SIGMOD Record
CrowdLabs: social analysis and visualization for the sciences
SSDBM'11 Proceedings of the 23rd international conference on Scientific and statistical database management
Repeatability and workability evaluation of SIGMOD 2011
ACM SIGMOD Record
Visualizing a Journal that Serves the Computational Sciences Community
Computing in Science and Engineering
Making Computations and Publications Reproducible with VisTrails
Computing in Science and Engineering
ReproZip: using provenance to support computational reproducibility
TaPP'13 Proceedings of the 5th USENIX conference on Theory and Practice of Provenance
Using provenance for repeatability
TaPP'13 Proceedings of the 5th USENIX conference on Theory and Practice of Provenance
Computational experiments have become an integral part of the scientific method, but reproducing, archiving, and querying them remains a challenge. The first barrier to wider adoption is that it is hard both for authors to derive a compendium that encapsulates all the components needed to reproduce a result and for reviewers to verify those results. In this tutorial, we present a series of guidelines and, through hands-on examples, review existing tools that help authors create reproducible results. We also outline open problems and new directions for database-related research on querying computational experiments.