Large-margin classification in infinite neural networks

  • Authors:
  • Youngmin Cho; Lawrence K. Saul

  • Venue:
  • Neural Computation
  • Year:
  • 2010

Abstract

We introduce a new family of positive-definite kernels for large-margin classification in support vector machines (SVMs). These kernels mimic the computation in large neural networks with one layer of hidden units. We also show how to derive new kernels, by recursive composition, that may be viewed as mapping their inputs through a series of nonlinear feature spaces. These recursively derived kernels mimic the computation in deep networks with multiple hidden layers. We evaluate SVMs with these kernels on problems designed to illustrate the advantages of deep architectures. Compared to previous benchmarks, we find that on some problems, these SVMs yield state-of-the-art results, beating not only other SVMs but also deep belief nets.
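
The abstract does not reproduce the kernels themselves. As a minimal sketch, the single-layer kernel the paper introduces (the arc-cosine kernel of degree n) and its layer-wise recursive composition can be written as below. The restriction to degrees n = 0 and n = 1, the function names, and the final Gram-matrix example are illustrative choices made here, not code from the paper.

```python
import numpy as np

def _j(theta, n):
    """Angular dependence J_n(theta) of the degree-n arc-cosine kernel
    (only degrees 0 and 1 are sketched here)."""
    if n == 0:
        return np.pi - theta
    if n == 1:
        return np.sin(theta) + (np.pi - theta) * np.cos(theta)
    raise NotImplementedError("only n = 0, 1 in this sketch")

def arc_cosine_kernel(x, y, n=1):
    """One-layer kernel k_n(x, y) = (1/pi) ||x||^n ||y||^n J_n(theta),
    mimicking an infinite layer of threshold units."""
    nx, ny = np.linalg.norm(x), np.linalg.norm(y)
    # Clip the cosine for numerical safety before taking the arccosine.
    theta = np.arccos(np.clip(np.dot(x, y) / (nx * ny), -1.0, 1.0))
    return (nx ** n) * (ny ** n) * _j(theta, n) / np.pi

def deep_arc_cosine_kernel(x, y, n=1, layers=3):
    """Recursive composition: at each layer, the inner product and norms
    are replaced by their kernel-induced counterparts, mimicking a deep
    network with `layers` hidden layers."""
    kxy = arc_cosine_kernel(x, y, n)
    kxx = arc_cosine_kernel(x, x, n)
    kyy = arc_cosine_kernel(y, y, n)
    for _ in range(layers - 1):
        theta = np.arccos(np.clip(kxy / np.sqrt(kxx * kyy), -1.0, 1.0))
        kxy = (kxx * kyy) ** (n / 2.0) * _j(theta, n) / np.pi
        # The self-kernels recurse with theta = 0.
        kxx = kxx ** n * _j(0.0, n) / np.pi
        kyy = kyy ** n * _j(0.0, n) / np.pi
    return kxy

# Example: build a Gram matrix, usable with an SVM that accepts a
# precomputed kernel (e.g. scikit-learn's SVC(kernel='precomputed')).
X = np.random.randn(5, 10)
K = np.array([[deep_arc_cosine_kernel(a, b) for b in X] for a in X])
```

Note that for degree n = 1 the self-kernel is preserved across layers, so repeated composition changes only the angular part of the kernel; this is one way the recursion can be iterated without rescaling the feature space.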