Large scale manifold transduction

  • Authors:
  • Michael Karlen; Jason Weston; Ayse Erkan; Ronan Collobert

  • Affiliations:
  • NEC Labs America, Princeton, NJ and École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; NEC Labs America, Princeton, NJ; NEC Labs America, Princeton, NJ and New York University, New York, NY; NEC Labs America, Princeton, NJ

  • Venue:
  • Proceedings of the 25th international conference on Machine learning
  • Year:
  • 2008

Abstract

We show how the regularizer of Transductive Support Vector Machines (TSVM) can be trained by stochastic gradient descent for linear models and multi-layer architectures. The resulting methods can be trained online, have vastly superior training and testing speed to existing TSVM algorithms, can encode prior knowledge in the network architecture, and obtain competitive error rates. We then go on to propose a natural generalization of the TSVM loss function that takes into account neighborhood and manifold information directly, unifying the two-stage Low Density Separation method into a single criterion, and leading to state-of-the-art results.
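To make the training procedure concrete, below is a minimal NumPy sketch of stochastic gradient descent on a TSVM-style loss for a linear model, extended with a neighborhood term in the spirit of the abstract's description. The toy data, hyperparameters (`lr`, `lam`, `gamma`), nearest-neighbor construction, and the exact form of the pair term are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs, a handful of labeled points, many unlabeled.
X_lab = np.vstack([rng.normal(-2.0, 1.0, (5, 2)), rng.normal(2.0, 1.0, (5, 2))])
y_lab = np.array([-1.0] * 5 + [1.0] * 5)
X_unl = np.vstack([rng.normal(-2.0, 1.0, (200, 2)), rng.normal(2.0, 1.0, (200, 2))])

# Precompute each unlabeled point's nearest unlabeled neighbor (brute force).
dist = np.linalg.norm(X_unl[:, None, :] - X_unl[None, :, :], axis=2)
np.fill_diagonal(dist, np.inf)
nn = dist.argmin(axis=1)

w = np.zeros(2)
b = 0.0
lr, lam, gamma = 0.05, 1e-3, 0.5  # step size, weight decay, unlabeled weight


def f(x):
    return x @ w + b


for t in range(5000):
    # Labeled term: standard hinge loss max(0, 1 - y f(x)) plus weight decay.
    i = rng.integers(len(X_lab))
    x, y = X_lab[i], y_lab[i]
    w -= lr * lam * w
    if y * f(x) < 1:
        w += lr * y * x
        b += lr * y

    # Unlabeled term (assumed form): pick a point and its nearest neighbor,
    # assign the pair a shared "guessed" label from the sign of their mean
    # prediction, and apply the hinge to both. If the neighborhood contains
    # only the point itself, this reduces to the TSVM symmetric hinge
    # max(0, 1 - |f(x)|), which pushes the boundary away from unlabeled data.
    j = rng.integers(len(X_unl))
    pair = (X_unl[j], X_unl[nn[j]])
    y_star = 1.0 if f(pair[0]) + f(pair[1]) >= 0 else -1.0
    for xu in pair:
        if y_star * f(xu) < 1:
            w += lr * gamma * y_star * xu
            b += lr * gamma * y_star

print("learned hyperplane:", w, b)
```

Because every update touches only one labeled example and one unlabeled pair, the procedure runs online and scales to large unlabeled sets; replacing the linear score `f` with a multi-layer network's output is the analogous extension to the deeper architectures mentioned in the abstract.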