SIGIR '92 Proceedings of the 15th annual international ACM SIGIR conference on Research and development in information retrieval
OHSUMED: an interactive retrieval evaluation and new large test collection for research. SIGIR '94 Proceedings of the 17th annual international ACM SIGIR conference on Research and development in information retrieval.
Analyses of multiple evidence combination. Proceedings of the 20th annual international ACM SIGIR conference on Research and development in information retrieval.
Improving the effectiveness of information retrieval with local context analysis. ACM Transactions on Information Systems (TOIS).
An information-theoretic approach to automatic query expansion. ACM Transactions on Information Systems (TOIS).
Improving retrieval feedback with multiple term-ranking function combination. ACM Transactions on Information Systems (TOIS).
Tuning before feedback: combining ranking discovery and blind feedback for robust retrieval. Proceedings of the 27th annual international ACM SIGIR conference on Research and development in information retrieval.
Implicit feedback for interactive information retrieval. ACM SIGIR Forum.
Simplified similarity scoring using term ranks. Proceedings of the 28th annual international ACM SIGIR conference on Research and development in information retrieval.
Assessing the term independence assumption in blind relevance feedback. Proceedings of the 28th annual international ACM SIGIR conference on Research and development in information retrieval.
On the use of negation in Boolean IR queries. Information Processing and Management: an International Journal.
In this study, we performed a comprehensive evaluation of the pseudo-relevance feedback technique for automatic query expansion using the OHSUMED test collection. Well-known term-sorting methods for selecting expansion terms were tested in our experiments, and we also proposed a new term-reweighting method for further performance improvement. Across multiple sets of tests, our results suggest that local context analysis is probably the most effective method for selecting good expansion terms from a set of MEDLINE documents, given enough feedback documents. Both the term-sorting and term-reweighting methods need to be considered carefully to achieve maximum performance improvement.
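The pseudo-relevance feedback pipeline described above can be sketched in a few lines: run an initial retrieval, assume the top-k documents are relevant, sort candidate terms by some scoring function, and append the best terms to the query. The sketch below is illustrative only, assuming a toy whitespace-tokenized corpus and a simple tf·idf term-sorting function; it is not the paper's actual ranking model or any of the specific term-sorting methods evaluated in the study.

```python
import math
from collections import Counter


def prf_expand(query, docs, k=2, n_terms=3):
    """Expand `query` via pseudo-relevance (blind) feedback.

    Illustrative sketch: rank documents by query-term overlap,
    assume the top-k are relevant, then sort candidate terms by a
    simple tf*idf score over that feedback set (one of many
    possible term-sorting functions).
    """
    tokenized = [d.lower().split() for d in docs]
    q_terms = set(query.lower().split())

    # Initial retrieval: score each document by query-term occurrences.
    scores = [sum(t in q_terms for t in d) for d in tokenized]
    top = sorted(range(len(docs)), key=lambda i: scores[i], reverse=True)[:k]

    # Term sorting: tf within the feedback docs * idf over the collection.
    df = Counter(t for d in tokenized for t in set(d))
    n_docs = len(docs)
    tf = Counter(t for i in top for t in tokenized[i])
    candidates = {t: tf[t] * math.log(n_docs / df[t])
                  for t in tf if t not in q_terms}
    expansion = sorted(candidates, key=candidates.get, reverse=True)[:n_terms]

    # Expanded query = original terms plus the top-scoring feedback terms.
    return sorted(q_terms) + expansion
```

In a full system, the term-reweighting step would additionally assign each expansion term a weight in the reformulated query rather than treating all terms equally, which is where the reweighting method discussed above comes into play.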