The standard criteria for the evaluation of information retrieval (IR) systems (effectiveness, efficiency, usability, satisfaction, and cost-benefit) seem as applicable to the interactive multimedia context as to the non-interactive, text-based context in which they were developed. However, the operationalizations, measures, and methods developed in the traditional context are, for a variety of reasons, almost wholly inadequate for the new context. This paper discusses some of the problematic aspects of evaluation in this new context and suggests some strategies for developing new measures and methodologies for the evaluation of interactive multimedia IR systems.