In the scientific experimentation process, an experiment's results must be analyzed and compared with others, potentially obtained under different conditions; the experimenter therefore needs to be able to redo the experiment. Several tools are dedicated to controlling an experiment's input parameters and to replaying it. In parallel and distributed systems, however, the experimental conditions are not restricted to the input parameters: they also include the software environment in which the experiment was carried out. It is therefore essential to be able to reconstruct this type of environment. This task can quickly become complex for experimenters, particularly on research platforms dedicated to scientific experimentation, where both hardware and software evolve constantly and rapidly. This article discusses the concept of reconstructability of software environments and proposes a tool for addressing this problem.
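To make the idea concrete, the minimal sketch below captures a small manifest of the software environment (OS, architecture, interpreter version, and, where available, installed packages) so that an experiment's conditions could later be compared or rebuilt. This is a hypothetical illustration only, not the article's actual tool, which handles environment reconstruction far more thoroughly; the `dpkg-query` call assumes a Debian-like system and is guarded so the sketch still runs elsewhere.

```python
import json
import platform
import subprocess
import sys


def capture_environment_manifest():
    """Record a minimal description of the software environment.

    Hypothetical sketch: a real reconstructability tool would record
    enough information (package versions, configuration, build recipes)
    to rebuild the environment, not merely describe it.
    """
    manifest = {
        "os": platform.system(),
        "os_release": platform.release(),
        "machine": platform.machine(),
        "python_version": sys.version.split()[0],
    }
    # On Debian-like systems, dpkg-query lists installed packages with
    # their exact versions; elsewhere we fall back to an empty list.
    try:
        out = subprocess.run(
            ["dpkg-query", "-W", "-f=${Package}=${Version}\n"],
            capture_output=True, text=True, check=True,
        )
        manifest["packages"] = out.stdout.strip().splitlines()
    except (OSError, subprocess.CalledProcessError):
        manifest["packages"] = []
    return manifest


if __name__ == "__main__":
    # Serialize the manifest so it can be archived alongside the
    # experiment's input parameters and results.
    print(json.dumps(capture_environment_manifest(), indent=2))
```

Archiving such a manifest with each run lets an experimenter detect when two results were obtained in different software environments, which is the first step toward reconstructing the original one.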