A widespread approach to ranking reduces the problem to a set of binary preferences and applies well-studied classification techniques. The basic question addressed in this paper is whether an accurate classifier transfers directly into a good ranker. In particular, we explore this reduction for subset ranking, which is based on optimizing Discounted Cumulative Gain (DCG), a standard position-sensitive performance measure. We propose a consistent reduction framework, guaranteeing that the minimal DCG regret is achievable by learning pairwise preferences assigned importance weights. This result further allows us to derive a novel upper bound on the DCG regret in terms of pairwise regrets. Empirical studies on benchmark datasets validate the proposed reduction approach, showing improved performance.
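
As a concrete illustration of the reduction described above, the following Python sketch forms importance-weighted pairwise preference examples from a single query's graded documents, trains a binary classifier on them, and scores the resulting ranking by DCG. The weight |y_i - y_j| and the logistic-regression learner are illustrative assumptions, not necessarily the exact choices analyzed in the paper.

import numpy as np
from itertools import combinations
from sklearn.linear_model import LogisticRegression

def dcg(relevances):
    # One common form of Discounted Cumulative Gain:
    # sum over ranked positions i = 1..k of (2^r_i - 1) / log2(i + 1).
    ranks = np.arange(1, len(relevances) + 1)
    return np.sum((2.0 ** np.asarray(relevances) - 1) / np.log2(ranks + 1))

def pairwise_examples(X, y):
    # Reduce one query's documents to weighted binary preference pairs.
    # Each pair (i, j) with y[i] != y[j] becomes an example on the feature
    # difference x_i - x_j; the weight |y_i - y_j| is an illustrative
    # stand-in for the paper's importance-weighting scheme.
    feats, labels, weights = [], [], []
    for i, j in combinations(range(len(y)), 2):
        if y[i] == y[j]:
            continue
        feats.append(X[i] - X[j])
        labels.append(1 if y[i] > y[j] else 0)
        weights.append(abs(y[i] - y[j]))
    return np.array(feats), np.array(labels), np.array(weights)

# Toy query: four documents with three features and graded relevance labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
y = np.array([3, 0, 1, 2])

F, L, W = pairwise_examples(X, y)
clf = LogisticRegression().fit(F, L, sample_weight=W)

# Rank documents by the learned linear scoring direction and evaluate DCG.
scores = X @ clf.coef_.ravel()
order = np.argsort(-scores)
print("DCG of learned ranking:", dcg(y[order]))
print("DCG of ideal ranking:  ", dcg(np.sort(y)[::-1]))

Ranking by a single learned scoring direction is only one way to turn pairwise predictions into a ranking; degree-based aggregation of the classifier's pairwise decisions is another common choice.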