Smoothing NDCG metrics using tied scores

  • Authors:
  • Andrey Kustarev, Yury Ustinovsky, Yury Logachev, Evgeny Grechnikov, Ilya Segalovich, Pavel Serdyukov

  • Affiliations:
  • Yandex, Moscow, Russian Federation (all authors)

  • Venue:
  • Proceedings of the 20th ACM international conference on Information and knowledge management
  • Year:
  • 2011


Abstract

One of the promising directions in learning-to-rank research concerns the appropriate choice of the objective function to be maximized by machine learning algorithms. We describe a novel technique for smoothing an arbitrary ranking metric and demonstrate how to use it to maximize retrieval quality in terms of the NDCG metric. The idea behind our listwise ranking model, called TieRank, is an artificial probabilistic tying of the predicted relevance scores at each iteration of the learning process, which defines a distribution over the set of all permutations of the retrieved documents. This distribution yields the desired smoothed version of the target retrieval quality metric, and the resulting smooth function can be maximized by gradient descent. Experiments on the LETOR collections show that TieRank outperforms most existing learning-to-rank algorithms.
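To make the general idea concrete, the sketch below shows one common way to smooth NDCG so it becomes differentiable in the predicted scores: each document's hard rank is replaced by a "soft rank", the expected rank when every pairwise score comparison is resolved probabilistically via a sigmoid. This is an illustrative stand-in, not the paper's TieRank construction (TieRank defines its own tying distribution over permutations); the function name `smooth_ndcg` and the `sigma` sharpness parameter are assumptions for this sketch.

```python
import math

def smooth_ndcg(scores, relevances, sigma=1.0):
    """Differentiable surrogate for NDCG (illustrative sketch).

    Instead of the hard rank of document i, we use its expected rank
    when each pairwise comparison i-vs-j is 'won' by j with probability
    sigmoid(sigma * (score_j - score_i)). As sigma grows, soft ranks
    approach hard ranks and the surrogate approaches true NDCG.
    """
    n = len(scores)
    soft_ranks = []
    for i in range(n):
        # Soft rank = 1 + sum over j != i of P(doc j is ranked above doc i)
        r = 1.0
        for j in range(n):
            if j != i:
                r += 1.0 / (1.0 + math.exp(sigma * (scores[i] - scores[j])))
        soft_ranks.append(r)

    # DCG with soft ranks in the discount; gains follow the 2^rel - 1 convention
    dcg = sum((2 ** rel - 1) / math.log2(1 + r)
              for rel, r in zip(relevances, soft_ranks))

    # Ideal DCG uses the best possible (hard) ordering of the relevance labels
    ideal = sum((2 ** rel - 1) / math.log2(1 + k)
                for k, rel in enumerate(sorted(relevances, reverse=True), start=1))
    return dcg / ideal if ideal > 0 else 0.0
```

Because the sigmoid is smooth in the scores, the gradient of this surrogate with respect to each score exists everywhere, which is exactly what allows gradient-descent training on an otherwise piecewise-constant metric. With well-separated scores in the correct order, the surrogate is close to 1; misordering the scores lowers it.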