Discounted cumulative gain (DCG) is widely used for evaluating ranking functions. It is therefore natural to learn a ranking function that directly optimizes DCG. However, DCG is non-smooth, rendering gradient-based optimization algorithms inapplicable. To remedy this, smoothed versions of DCG have been proposed, but with only partial success. In this paper, we first present an analysis showing that using the gradient of smoothed DCG to drive the optimization algorithm is ineffective. We then propose a novel approach, SHF-SDCG, which smooths DCG using smoothed hinge functions (SHF). Its advantage is that it transitions seamlessly from driving the optimization in a way that mimics pairwise learning, when the ranking function does not fit the data well, to driving the optimization with DCG once the ranking function becomes more accurate. We then extend SHF-SDCG to REG-SHF-SDCG, an algorithm that gradually transitions from pointwise and pairwise to listwise learning. Finally, experimental results are provided to validate the effectiveness of SHF-SDCG and REG-SHF-SDCG.
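To make the setting concrete, the sketch below contrasts the non-smooth DCG metric with a smoothed-hinge pairwise surrogate. The exact SHF form and the SHF-SDCG objective from the paper are not reproduced here; this sketch assumes a standard Huber-style smoothing of the hinge `max(0, 1 - t)` and a simple pairwise penalty over pairs where the first item is more relevant, purely for illustration.

```python
import numpy as np

def dcg(scores, gains):
    """DCG of the ranking induced by scores.
    Non-smooth in scores: an infinitesimal score change can swap ranks."""
    order = np.argsort(-scores)
    discounts = 1.0 / np.log2(np.arange(2, len(scores) + 2))
    return float(np.sum(gains[order] * discounts))

def smoothed_hinge(t, sigma=0.5):
    """Huber-style smooth approximation of the hinge max(0, 1 - t).
    Zero for t >= 1, quadratic near the kink, linear for t <= 1 - sigma."""
    t = np.asarray(t, dtype=float)
    return np.where(t >= 1.0, 0.0,
           np.where(t <= 1.0 - sigma,
                    1.0 - t - sigma / 2.0,
                    (1.0 - t) ** 2 / (2.0 * sigma)))

def pairwise_shf_loss(scores, gains, sigma=0.5):
    """Illustrative pairwise surrogate: penalize each pair (i, j) with
    gains[i] > gains[j] by the smoothed hinge of the score margin."""
    loss = 0.0
    for i in range(len(scores)):
        for j in range(len(scores)):
            if gains[i] > gains[j]:
                loss += float(smoothed_hinge(scores[i] - scores[j], sigma))
    return loss
```

Because the surrogate is differentiable everywhere, gradient descent can reduce it directly; once all correctly-ordered pairs have margin at least 1, the smoothed-hinge penalty vanishes, which is the regime in which a DCG-driven term can take over.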