We present a novel learning algorithm, DirectRank, which directly and exactly optimizes ranking measures without resorting to upper bounds or approximations. Our approach is essentially an iterative coordinate ascent method: in each iteration, we select one coordinate and update only the corresponding parameter, holding all others fixed. Because the ranking measure is a piecewise-constant step function of a single parameter, we propose a novel line search algorithm that efficiently locates the interval with the best ranking measure along that coordinate. To stabilize the system on small datasets, we construct a probabilistic framework over document-query pairs that maximizes the likelihood of the target permutation of the top-$\tau$ documents. This iterative procedure ensures convergence. Furthermore, we integrate regression trees as our weak learners in order to capture correlations among features. Experiments on the LETOR datasets and on two large datasets, the Yahoo! Learning to Rank Challenge data and the Microsoft 30K web data, show improvements over state-of-the-art systems.
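The line-search idea above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a linear scoring model, a single query, and NDCG as the ranking measure, and all function names (`ndcg`, `line_search_coordinate`) are illustrative. Along one coordinate, the ranking changes only at the finitely many points where two documents' scores tie, so the measure is constant on the intervals between those breakpoints and can be maximized exactly by evaluating one candidate per interval.

```python
import numpy as np

def ndcg(scores, labels, k=10):
    """NDCG@k for one query; labels are graded relevance judgments."""
    order = np.argsort(-scores, kind="stable")
    gains = (2.0 ** labels[order] - 1.0)[:k]
    dcg = float(gains @ (1.0 / np.log2(np.arange(2, gains.size + 2))))
    igains = (2.0 ** np.sort(labels)[::-1] - 1.0)[:k]
    idcg = float(igains @ (1.0 / np.log2(np.arange(2, igains.size + 2))))
    return dcg / idcg if idcg > 0 else 0.0

def line_search_coordinate(X, labels, w, j):
    """Exact line search for w[j] with all other coordinates fixed.

    With linear scores (X @ w), the ranking measure is a step function of
    w[j]: it changes only at values t where two documents tie. We collect
    those breakpoints, evaluate one point inside each constant interval,
    and return the best value found.
    """
    base = X @ w - w[j] * X[:, j]      # scores with coordinate j removed
    xj = X[:, j]
    n = base.size
    # Breakpoints: base[i] + t*xj[i] == base[m] + t*xj[m]
    ts = []
    for i in range(n):
        for m in range(i + 1, n):
            d = xj[i] - xj[m]
            if abs(d) > 1e-12:
                ts.append((base[m] - base[i]) / d)
    ts = np.sort(np.unique(ts))
    if ts.size == 0:                   # coordinate never changes the ranking
        return w[j], ndcg(base + w[j] * xj, labels)
    # One candidate per interval: midpoints plus a point beyond each end.
    cands = [ts[0] - 1.0] + list((ts[:-1] + ts[1:]) / 2.0) + [ts[-1] + 1.0]
    best_t, best_m = w[j], -1.0
    for t in cands:
        m = ndcg(base + t * xj, labels)
        if m > best_m:
            best_m, best_t = m, t
    return best_t, best_m
```

One full coordinate-ascent iteration would sweep this search over every coordinate in turn; since each update evaluates the true measure and only accepts improvements, the objective is monotonically non-decreasing.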