Trading convexity for scalability

  • Authors:
  • Ronan Collobert; Fabian Sinz; Jason Weston; Léon Bottou

  • Affiliations:
  • NEC Labs America, Princeton, NJ; NEC Labs America, Princeton, NJ and Max Planck Institute for Biological Cybernetics, Tuebingen, Germany; NEC Labs America, Princeton, NJ; NEC Labs America, Princeton, NJ

  • Venue:
  • ICML '06: Proceedings of the 23rd International Conference on Machine Learning
  • Year:
  • 2006

Abstract

Convex learning algorithms, such as Support Vector Machines (SVMs), are often seen as highly desirable because they offer strong practical properties and are amenable to theoretical analysis. However, in this work we show how non-convexity can provide scalability advantages over convexity. We show how concave-convex programming can be applied to produce (i) faster SVMs where training errors are no longer support vectors, and (ii) much faster Transductive SVMs.
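The scalability argument in the abstract rests on concave-convex programming (the CCCP): the non-convex ramp loss is split into a convex hinge part and a concave part, the concave part is linearized at the current solution, and the resulting convex problem is solved repeatedly. Points misclassified beyond a cutoff s then exert no force on the solution, so training errors stop being support vectors. The following is a minimal illustrative sketch of that idea, not the authors' implementation: it assumes a linear model, a plain subgradient inner solver, and hypothetical names (cccp_ramp_svm, C, s, lr).

import numpy as np

def cccp_ramp_svm(X, y, C=1.0, s=-1.0, outer_iters=10, inner_iters=200, lr=0.01):
    """Linear SVM with the ramp loss R_s(z) = H_1(z) - H_s(z), trained by CCCP.

    H_a(z) = max(0, a - z) is a shifted hinge; z = y * f(x). Each outer
    iteration linearizes the concave part -H_s at the current solution and
    minimizes the resulting convex surrogate (here by subgradient descent;
    any standard convex SVM solver could be substituted).
    """
    _, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(outer_iters):
        # Linearization of the concave part: beta_i = C exactly when
        # y_i f(x_i) < s (badly misclassified points).
        margins = y * (X @ w + b)
        beta = C * (margins < s).astype(float)
        for _ in range(inner_iters):
            margins = y * (X @ w + b)
            active = (margins < 1).astype(float)   # hinge H_1 subgradient mask
            coef = C * active - beta               # zero for points below the cutoff s
            w -= lr * (w - X.T @ (coef * y))
            b -= lr * (-np.sum(coef * y))
    return w, b

# Toy usage: two Gaussian blobs with a few flipped labels (outliers) that a
# plain hinge-loss SVM would keep as support vectors.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
y[:5] = 1.0
w, b = cccp_ramp_svm(X, y)
print("train accuracy:", np.mean(np.sign(X @ w + b) == y))

In the inner loop, an outlier with margin below s has active = 1 and beta = C, so its coefficient is zero: the linearized concave term cancels the hinge's pull on it, which is how such points drop out of the support vector set.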