The Cranfield tests on index language devices
Readings in information retrieval
Using titles and category names from editor-driven taxonomies for automatic evaluation
CIKM '03 Proceedings of the twelfth international conference on Information and knowledge management
TREC: Experiment and Evaluation in Information Retrieval (Digital Libraries and Electronic Publishing)
Journal of Systems and Software
Building simulated queries for known-item topics: an analysis using six European languages
SIGIR '07 Proceedings of the 30th annual international ACM SIGIR conference on Research and development in information retrieval
On the history of evaluation in IR
Journal of Information Science
Workflows and e-Science: An overview of workflow system features and capabilities
Future Generation Computer Systems
Methods for Evaluating Interactive Information Retrieval Systems with Users
Foundations and Trends in Information Retrieval
Service Science, Management and Engineering: Education for the 21st Century
Improvements that don't add up: ad-hoc retrieval results since 1998
Proceedings of the 18th ACM conference on Information and knowledge management
Collaborative Information Retrieval in an information-intensive domain
Information Processing and Management: an International Journal
Scientific workflows and clouds
Crossroads - Plugging Into the Cloud
The importance of scientific data curation for evaluation campaigns
DELOS'07 Proceedings of the 1st international conference on Digital libraries: research and development
Information interaction in molecular medicine: integrated use of multiple channels
Proceedings of the third symposium on Information interaction in context
Bioinformatics
CLEF'10 Proceedings of the 2010 international conference on Multilingual and multimodal information access evaluation: cross-language evaluation forum
Validating query simulators: an experiment using commercial searches and purchases
CLEF'10 Proceedings of the 2010 international conference on Multilingual and multimodal information access evaluation: cross-language evaluation forum
Automated component-level evaluation: present and future
CLEF'10 Proceedings of the 2010 international conference on Multilingual and multimodal information access evaluation: cross-language evaluation forum
A PROMISE for experimental evaluation
CLEF'10 Proceedings of the 2010 international conference on Multilingual and multimodal information access evaluation: cross-language evaluation forum
Crowdsourcing for search evaluation
ACM SIGIR Forum
Visual exploration of stream pattern changes using a data-driven framework
ISVC'10 Proceedings of the 6th international conference on Advances in visual computing - Volume Part II
The scholarly impact of TRECVid (2003–2009)
Journal of the American Society for Information Science and Technology
Pseudo test collections for learning web search ranking functions
Proceedings of the 34th international ACM SIGIR conference on Research and development in Information Retrieval
Information Retrieval Evaluation
CLEF'11 Proceedings of the Second international conference on Multilingual and multimodal information access evaluation
Assessing the scholarly impact of ImageCLEF
CLEF'11 Proceedings of the Second international conference on Multilingual and multimodal information access evaluation
Crowdsourcing for information retrieval
ACM SIGIR Forum
Visual interactive failure analysis: supporting users in information retrieval evaluation
Proceedings of the 4th Information Interaction in Context Symposium
Ground truth generation in medical imaging: a crowdsourcing-based iterative approach
Proceedings of the ACM multimedia 2012 workshop on Crowdsourcing for multimedia
CLEF'12 Proceedings of the Third international conference on Information Access Evaluation: multilinguality, multimodality, and visual analytics
DIRECTions: design and specification of an IR evaluation infrastructure
CLEF'12 Proceedings of the Third international conference on Information Access Evaluation: multilinguality, multimodality, and visual analytics
Visual analytics and information retrieval
PROMISE'12 Proceedings of the 2012 international conference on Information Retrieval Meets Information Visualization
The PROMISE network of excellence organized a two-day brainstorming workshop on 30 and 31 May 2012 in Padua, Italy, to discuss and envisage future directions and perspectives for the evaluation of information access and retrieval systems in multiple languages and multiple media. This document reports on the outcomes of that event and provides details about the six envisaged research lines: search applications; contextual evaluation; challenges in test collection design and exploitation; component-based evaluation; ongoing evaluation; and signal-aware evaluation. The ultimate goal of the PROMISE retreat is to stimulate and involve the research community along these research lines and to provide funding agencies with effective and scientifically sound ideas for coordinating and supporting information access research.