Information retrieval performance evaluation is commonly based on the classical recall and precision figures or graphs. However, important information about the causes of variation may remain hidden beneath averaged recall and precision figures. Identifying significant causes of variation can help researchers and developers focus on the opportunities for improvement that underlie the averages. This article presents a case study showing the potential of a statistical repeated-measures analysis of variance for testing the significance of factors in retrieval performance variation. The TREC-9 Query Track performance data are used as a case study, and the factors studied are retrieval method, topic, and their interaction. The results show that retrieval method, topic, and their interaction are all significant. A topic-level analysis is also performed to examine how the performance of retrieval methods varies across topics. For most topics, the observed performance gains of the expansion runs are statistically significant improvements. Analyses of the effect of query expansion on document ranking confirm that expansion affects ranking positively.
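The variance partitioning behind such an analysis can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's code: it decomposes a small, made-up grid of average-precision scores (rows = retrieval methods, columns = topics) into sums of squares for method, topic, and residual, the quantities a two-way ANOVA tests for significance. The scores and run labels are invented for illustration.

```python
# Illustrative sketch of a two-way variance decomposition on a
# method x topic grid of average-precision scores (made-up numbers).
scores = [
    [0.30, 0.10, 0.45, 0.22],   # hypothetical baseline run
    [0.38, 0.12, 0.55, 0.30],   # hypothetical expansion run A
    [0.36, 0.15, 0.52, 0.28],   # hypothetical expansion run B
]

m = len(scores)          # number of retrieval methods
t = len(scores[0])       # number of topics
grand = sum(sum(row) for row in scores) / (m * t)

# Sum of squares attributable to the method factor (row means)
ss_method = t * sum((sum(row) / t - grand) ** 2 for row in scores)

# Sum of squares attributable to the topic factor (column means)
col_means = [sum(scores[i][j] for i in range(m)) / m for j in range(t)]
ss_topic = m * sum((cm - grand) ** 2 for cm in col_means)

# Total variation and the residual (interaction + error) that remains
ss_total = sum((x - grand) ** 2 for row in scores for x in row)
ss_resid = ss_total - ss_method - ss_topic

# F ratio for the method effect; comparing it against an F distribution
# with (m-1, (m-1)*(t-1)) degrees of freedom gives the significance test.
f_method = (ss_method / (m - 1)) / (ss_resid / ((m - 1) * (t - 1)))

print(f"SS method = {ss_method:.4f}")
print(f"SS topic  = {ss_topic:.4f}")
print(f"SS resid  = {ss_resid:.4f}")
print(f"F(method) = {f_method:.2f}")
```

In this toy grid the topic sum of squares dwarfs the method sum of squares, mirroring the common finding that topic is the dominant source of variation in retrieval experiments. A full repeated-measures ANOVA, as used in the article, additionally models topics as the repeated factor and tests the method x topic interaction against within-cell replicates.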