Deliberate degradation of search results is a common tool in user experiments: high-quality result lists are degraded by inserting non-relevant documents at different ranks. The effect of these manipulations on a number of commonly used metrics is counter-intuitive: the discount functions implicit in P@k, MRR, NDCG, and others do not reflect the true relationship between rank and value to the user. We propose an alternative discount based on visibility data.
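To make the implicit discount functions concrete, here is a minimal sketch (not the paper's own code; relevance is assumed binary and the function names are illustrative) of how P@k, reciprocal rank, and NDCG each weight ranks, and how inserting a non-relevant document at the top of a perfect ranking moves the scores:

```python
# Sketch of the implicit per-rank discounts in common IR metrics.
# Binary relevance, 1-based ranks; all names here are illustrative.
import math

def precision_at_k(rels, k):
    """P@k: equal weight on every rank up to k, zero weight beyond."""
    return sum(rels[:k]) / k

def reciprocal_rank(rels):
    """RR: all weight on the first relevant document's rank."""
    for rank, rel in enumerate(rels, start=1):
        if rel:
            return 1.0 / rank
    return 0.0

def ndcg(rels, k):
    """NDCG@k with the usual 1/log2(rank + 1) discount."""
    dcg = sum(r / math.log2(i + 1) for i, r in enumerate(rels[:k], start=1))
    ideal = sorted(rels, reverse=True)
    idcg = sum(r / math.log2(i + 1) for i, r in enumerate(ideal[:k], start=1))
    return dcg / idcg if idcg > 0 else 0.0

# Degrade a perfect ranking by inserting a non-relevant document at rank 1:
perfect = [1, 1, 1, 0, 0]
degraded = [0, 1, 1, 1, 0]
```

Because each metric hard-codes a different discount, the same single insertion costs RR half its value (1.0 to 0.5) but NDCG@5 only about a quarter, which is the kind of divergence from observed user behaviour the abstract points to.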