In this article, principal components analysis (PCA) is proposed as a method for comparing the performance of different retrieval strategies. It is shown that PCA can reveal implicit performance relations among retrieval systems across information needs (i.e., queries, topics). For illustration, the TREC 12 robust track data are reevaluated with the PCA method, which readily exposes performance relations that are hard to see with traditional techniques. PCA therefore promises a uniform framework for large-scale evaluation of retrieval experiments. In addition to the mean average precision (MAP) measure, relative analytic distance (RAD) is proposed as a new performance summary measure based on the same notions introduced by PCA. © 2007 Wiley Periodicals, Inc.
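The general idea, independent of the article's specific RAD measure, can be sketched as follows: arrange per-topic effectiveness scores into a systems-by-topics matrix, apply PCA, and inspect how the systems project onto the leading components. The matrix values and system names below are invented for illustration; the article itself uses TREC 12 robust track runs.

```python
import numpy as np

# Hypothetical systems-by-topics matrix of average precision (AP) scores:
# rows = retrieval systems, columns = topics (queries).
ap = np.array([
    [0.31, 0.42, 0.18, 0.55, 0.27],   # system A
    [0.29, 0.40, 0.20, 0.52, 0.25],   # system B (behaves like A)
    [0.10, 0.60, 0.05, 0.70, 0.12],   # system C (strongly topic-sensitive)
])

# Center the scores per topic, then obtain principal components via SVD.
centered = ap - ap.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

# Project each system onto the first two principal components; systems
# with similar per-topic behavior land close together in this space.
scores = U[:, :2] * s[:2]
for name, (pc1, pc2) in zip("ABC", scores):
    print(f"system {name}: PC1={pc1:+.3f}, PC2={pc2:+.3f}")

# MAP per system, the traditional single-number summary, for contrast.
print("MAP:", ap.mean(axis=1))
```

In this toy setting, systems A and B sit near each other in the PC space while C is far away, even though their MAP values alone would not show *where* the behavior differs across topics.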