Finding a good query-related topic for boosting pseudo-relevance feedback

  • Authors:
  • Zheng Ye; Jimmy Xiangji Huang; Hongfei Lin

  • Affiliations:
  • Department of Computer Science and Engineering, Dalian University of Technology, Dalian, People's Republic of China and School of Information Technology, York University, Toronto, Canada; School of Information Technology, York University, Toronto, Canada; Department of Computer Science and Engineering, Dalian University of Technology, Dalian, People's Republic of China

  • Venue:
  • Journal of the American Society for Information Science and Technology
  • Year:
  • 2011


Abstract

Pseudo-relevance feedback (PRF) via query expansion (QE) assumes that the top-ranked documents from the first-pass retrieval are relevant. The most informative terms in the pseudo-relevant feedback documents are then used to update the original query representation in order to boost retrieval performance. Most current PRF approaches estimate the importance of candidate expansion terms from their document-level statistics. However, a document used for PRF may consist of different topics, not all of which are related to the query, even if the document is judged relevant. The main argument of this article is that PRF should be conducted at a granularity finer than the document level. In this article, we propose a topic-based feedback model with three different strategies for finding a good query-related topic based on the Latent Dirichlet Allocation (LDA) model. The experimental results on four representative TREC collections show that QE based on the derived topic achieves statistically significant improvements over a strong feedback model in the language modeling framework, which updates the query representation based on the top-ranked documents. © 2011 Wiley Periodicals, Inc.
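
The sketch below illustrates the general idea described in the abstract, not the authors' exact method: fit LDA over the top-ranked feedback documents, select the single topic that best explains the original query (here, by the log-likelihood of the query terms under each topic's word distribution, which is only one plausible selection strategy), and interpolate that topic's top terms into the query model. The function name, parameters (`num_topics`, `top_n`, `alpha`), and the use of gensim are illustrative assumptions.

```python
# Hedged sketch of topic-based PRF with LDA (not the paper's exact model).
import numpy as np
from gensim import corpora, models

def topic_based_expansion(feedback_docs, query_terms,
                          num_topics=10, top_n=20, alpha=0.7):
    """feedback_docs: list of token lists from the first-pass top-ranked documents.
    query_terms: list of original query tokens.
    Returns an expanded query language model as {term: probability}."""
    dictionary = corpora.Dictionary(feedback_docs)
    corpus = [dictionary.doc2bow(d) for d in feedback_docs]
    lda = models.LdaModel(corpus, id2word=dictionary,
                          num_topics=num_topics, passes=10, random_state=0)

    # One possible topic-selection strategy: score each topic by the
    # log-likelihood it assigns to the original query terms.
    phi = lda.get_topics()                      # num_topics x |V| word distributions
    q_ids = [dictionary.token2id[t] for t in query_terms
             if t in dictionary.token2id]
    topic_scores = np.log(phi[:, q_ids] + 1e-12).sum(axis=1)
    best_topic = int(topic_scores.argmax())

    # Expansion terms: the most probable words of the selected topic.
    expansion = lda.show_topic(best_topic, topn=top_n)   # [(term, prob), ...]

    # Interpolate the topic-based term distribution with a uniform
    # original query model, as in standard LM-based query expansion.
    query_model = {t: alpha / len(query_terms) for t in query_terms}
    norm = sum(p for _, p in expansion)
    for term, p in expansion:
        query_model[term] = query_model.get(term, 0.0) + (1 - alpha) * p / norm
    return query_model
```

The interpolation weight `alpha` plays the same role as the feedback coefficient in relevance-model style expansion; smaller values put more weight on the topic-derived terms, larger values keep the expanded query closer to the original one.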