Understanding computers and cognition
Comparing retrieval performance in online data bases
Information Processing and Management: an International Journal
Generating an individualized user interface
SIGIR '87 Proceedings of the 10th annual international ACM SIGIR conference on Research and development in information retrieval
Introduction to Modern Information Retrieval
Search improvement via automatic query reformulation
ACM Transactions on Information Systems (TOIS) - Special issue on research and development in information retrieval
Incremental relevance feedback
SIGIR '92 Proceedings of the 15th annual international ACM SIGIR conference on Research and development in information retrieval
Controlling the complexity in comparing search user interfaces via user studies
Information Processing and Management: an International Journal
Planning the evaluation of an information retrieval system involves two steps: first, determining performance descriptors and measures appropriate to the system objectives; second, developing an evaluation design that ensures the effect of variation in the components of interest is isolated and assessed in an unbiased fashion. This paper examines the question of retrieval system evaluation from the perspective of the user. It presents evaluation procedures appropriate to this perspective, which can be used to isolate the effect of variation in the user interface to the system. The general procedure is exemplified by an application to the evaluation of an experimental OPAC interface.
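As a minimal sketch of the first step above (choosing performance measures), the following computes set-based precision and recall for a single query. The function name and the document identifiers are hypothetical, and real user-oriented evaluation would supplement these system-oriented measures as the abstract suggests:

```python
def precision_recall(retrieved, relevant):
    """Compute set-based precision and recall for one query.

    retrieved: documents the system returned (order ignored)
    relevant:  documents judged relevant by the user
    """
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant  # relevant documents that were retrieved
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical judgments: 4 retrieved, 5 relevant, 3 in common.
p, r = precision_recall(["d1", "d2", "d3", "d7"],
                        ["d1", "d2", "d3", "d4", "d5"])
# p == 0.75, r == 0.6
```

Holding the collection, queries, and relevance judgments fixed while varying only the interface is one way to isolate the interface effect the abstract describes.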