Approximate reduction from AUC maximization to 1-norm soft margin optimization

  • Authors:
  • Daiki Suehiro; Kohei Hatano; Eiji Takimoto

  • Affiliations:
  • Department of Informatics, Kyushu University (all authors)

  • Venue:
  • ALT'11: Proceedings of the 22nd International Conference on Algorithmic Learning Theory
  • Year:
  • 2011


Abstract

Finding linear classifiers that maximize AUC scores is important in ranking research. This task is naturally formulated as a 1-norm hard/soft margin optimization problem over the pn pairs formed from p positive and n negative instances. However, solving these optimization problems directly is impractical, since the problem size (pn) is quadratically larger than the given sample size (p + n). In this paper, we give (approximate) reductions from these problems to hard/soft margin optimization problems of linear size. First, for the hard margin case, we show that the problem reduces to a hard margin optimization problem over p + n instances in which the bias (constant) term is also optimized. Then, for the soft margin case, we show that the problem approximately reduces to a soft margin optimization problem over p + n instances, for which the resulting linear classifier is guaranteed to have a certain margin over the pairs.
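To illustrate the quadratic-versus-linear size gap the abstract refers to, the sketch below (our own illustration, not code from the paper) computes the AUC of a scorer in two equivalent ways: by enumerating all pn (positive, negative) pairs, and by a rank-based (Mann-Whitney U) formula that only sorts the p + n pooled scores once. The scores and function names are hypothetical.

```python
import numpy as np

def auc_pairwise(pos, neg):
    """AUC as the fraction of correctly ordered (positive, negative)
    pairs -- enumerates all p*n pairs, hence quadratic in sample size."""
    p, n = len(pos), len(neg)
    correct = sum(1.0 if sp > sn else 0.5 if sp == sn else 0.0
                  for sp in pos for sn in neg)
    return correct / (p * n)

def auc_rank(pos, neg):
    """Equivalent AUC via the Mann-Whitney U statistic: one sort of the
    pooled p + n scores, so O((p + n) log(p + n)) instead of O(p*n).
    Assumes distinct scores for simplicity (no tie handling)."""
    p, n = len(pos), len(neg)
    pooled = np.concatenate([pos, neg])
    ranks = pooled.argsort().argsort() + 1  # 1-based ranks in the pool
    u = ranks[:p].sum() - p * (p + 1) / 2   # U statistic for positives
    return u / (p * n)

pos = np.array([0.9, 0.8, 0.4])  # hypothetical scores of positive instances
neg = np.array([0.5, 0.3, 0.2])  # hypothetical scores of negative instances
print(auc_pairwise(pos, neg), auc_rank(pos, neg))  # both 8/9
```

The paper's contribution is the analogous size reduction for the margin *optimization* problems themselves, not merely for AUC evaluation; the sketch only shows why formulations over pn pairwise objects become costly as the sample grows.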