Using statistical testing in the evaluation of retrieval experiments
SIGIR '93 Proceedings of the 16th annual international ACM SIGIR conference on Research and development in information retrieval
The Cranfield tests on index language devices
Readings in information retrieval
The Lowell database research self-assessment
Communications of the ACM - Adaptive complex enterprises
The importance of scientific data curation for evaluation campaigns
DELOS'07 Proceedings of the 1st international conference on Digital libraries: research and development
CLEF 2005: ad hoc track overview
CLEF'05 Proceedings of the 6th international conference on Cross-Language Evaluation Forum: Accessing Multilingual Information Repositories
Scientific evaluation of a DLMS: a service for evaluating information access components
ECDL'06 Proceedings of the 10th European conference on Research and Advanced Technology for Digital Libraries
DIRECT: a system for evaluating information access components of digital libraries
ECDL'05 Proceedings of the 9th European conference on Research and Advanced Technology for Digital Libraries
DIRECTions: design and specification of an IR evaluation infrastructure
CLEF'12 Proceedings of the Third international conference on Information Access Evaluation: multilinguality, multimodality, and visual analytics
This paper examines the current way of keeping the data produced during evaluation campaigns and highlights some of its shortcomings. As a consequence, we propose a new approach for improving the management of evaluation campaigns' data. In this approach, the data are treated as scientific data to be curated and enriched in order to give full support to longitudinal statistical studies and long-term preservation.