Robust sparse rank learning for non-smooth ranking measures

  • Authors:
  • Zhengya Sun; Tao Qin; Qing Tao; Jue Wang

  • Affiliations:
  • Institute of Automation, Chinese Academy of Sciences, Beijing, China; Microsoft Research Asia, Beijing, China; Institute of Automation, Chinese Academy of Sciences, Beijing, China; Institute of Automation, Chinese Academy of Sciences, Beijing, China

  • Venue:
  • Proceedings of the 32nd international ACM SIGIR conference on Research and development in information retrieval
  • Year:
  • 2009

Abstract

Directly optimizing ranking measures and inducing sparsity in learning models have both received increasing attention recently, yet few attempts have been made to combine the two when approaching the problem of learning to rank. In this paper, we consider sparse algorithms that directly optimize Normalized Discounted Cumulative Gain (NDCG), a widely used ranking measure. We begin by establishing a reduction framework under which ranking, as measured by NDCG, is reduced to importance-weighted pairwise classification. We further provide a sound theoretical guarantee for this reduction, bounding the realized NDCG regret in terms of a properly weighted pairwise classification regret; this implies that good performance transfers robustly from pairwise classification to ranking. The converted pairwise loss function makes it possible to incorporate sparsity into ranking models and to derive a gradient with a performance guarantee. To achieve sparsity, we devise a novel algorithm, RSRank, which performs L1 regularization via truncated gradient descent. Finally, experimental results on a benchmark collection confirm the significant advantage of RSRank over several baseline methods.
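To make the abstract's two building blocks concrete, the sketch below computes NDCG and performs one truncated-gradient step on an importance-weighted pairwise loss. This is a minimal sketch, not the paper's implementation: the logistic surrogate for the pairwise loss, the particular gain/discount form of NDCG, and the parameter names (`eta`, `gravity`, `theta`) are assumptions for illustration only.

```python
import numpy as np

def ndcg_at_k(rels, k=10):
    """NDCG@k for a list of graded relevance labels in ranked order.
    Uses the common gain/discount form (2^rel - 1) / log2(rank + 1);
    the exact variant used in the paper may differ."""
    def dcg(scores):
        scores = np.asarray(scores, dtype=float)[:k]
        ranks = np.arange(1, len(scores) + 1)
        return float(np.sum((2.0 ** scores - 1.0) / np.log2(ranks + 1)))
    ideal = dcg(sorted(rels, reverse=True))
    return dcg(rels) / ideal if ideal > 0 else 0.0

def truncate(w, alpha, theta):
    """Truncated-gradient shrinkage T(w, alpha, theta) in the style of
    Langford et al. (2009): weights within [-theta, theta] are pulled
    toward zero by alpha, which induces L1-style sparsity."""
    out = w.copy()
    pos = (w >= 0) & (w <= theta)
    neg = (w < 0) & (w >= -theta)
    out[pos] = np.maximum(0.0, w[pos] - alpha)
    out[neg] = np.minimum(0.0, w[neg] + alpha)
    return out

def pairwise_step(w, x_pref, x_other, weight,
                  eta=0.1, gravity=0.01, theta=np.inf):
    """One stochastic gradient step on an importance-weighted pairwise
    logistic loss (a stand-in for the paper's converted loss), followed
    by truncation. `weight` is the NDCG-derived importance of the pair;
    x_pref should rank above x_other."""
    diff = x_pref - x_other
    margin = float(np.dot(w, diff))
    # gradient of weight * log(1 + exp(-margin)) with respect to w
    grad = -weight * diff / (1.0 + np.exp(margin))
    w = w - eta * grad
    return truncate(w, eta * gravity, theta)

# Toy usage: a ranked list of relevance grades, then one update.
print(ndcg_at_k([3, 1, 2], k=3))  # < 1.0 for this imperfect ordering
w = pairwise_step(np.zeros(4), np.ones(4), np.zeros(4), weight=0.8)
print(w)  # weights shrunk toward zero by the truncation operator
```

The truncation operator is applied after each gradient step, so features whose accumulated updates stay below the shrinkage amount remain exactly zero, which is how the sparsity of the learned ranking model arises under this scheme.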