Gradient-Based Adaptation of General Gaussian Kernels

  • Authors:
  • Tobias Glasmachers; Christian Igel

  • Affiliations:
  • Institut für Neuroinformatik, Ruhr-Universität Bochum, D-44780 Bochum, Germany (both authors)

  • Venue:
  • Neural Computation
  • Year:
  • 2005

Abstract

Gradient-based optimization of Gaussian kernel functions is considered. The gradient for the adaptation of scaling and rotation of the input space is computed so as to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant-trace subspace, the kernel size can be controlled. This is useful, for example, to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard-margin support vector machines on toy data.
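
The following Python sketch illustrates the kind of parameterization the abstract describes: a general Gaussian kernel whose metric matrix is obtained through the exponential map of a symmetric matrix, together with a projection onto the trace-zero subspace so that gradient steps leave the kernel size fixed. The function names and the gradient step are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np
from scipy.linalg import expm


def general_gaussian_kernel(x, z, A):
    """General Gaussian kernel k(x, z) = exp(-(x - z)^T M (x - z)).

    The metric M = expm(A) is parameterized via the exponential map of a
    symmetric matrix A, so M stays symmetric positive definite for any
    unconstrained symmetric A (no explicit positivity constraint needed).
    """
    M = expm(A)
    d = np.asarray(x) - np.asarray(z)
    return float(np.exp(-d @ M @ d))


def project_traceless(G):
    """Project a symmetric matrix onto the subspace of trace-zero matrices.

    Restricting gradient steps to this subspace keeps trace(A) constant,
    which fixes the overall kernel size while scaling and rotation of the
    input space are still adapted.
    """
    n = G.shape[0]
    return G - (np.trace(G) / n) * np.eye(n)


def constant_trace_step(A, dJ_dA, learning_rate=0.1):
    """One illustrative constant-trace gradient step.

    The objective J and its gradient dJ_dA would come from, e.g., a
    radius-margin criterion; both are assumed to be supplied by the caller.
    """
    sym_grad = 0.5 * (dJ_dA + dJ_dA.T)   # keep the update symmetric
    return A - learning_rate * project_traceless(sym_grad)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.zeros((2, 2))                 # M = expm(0) = identity
    x, z = rng.standard_normal(2), rng.standard_normal(2)
    print("k(x, z) =", general_gaussian_kernel(x, z, A))
```

The particular form of the kernel (e.g., whether a bandwidth factor is written out separately) is a convention choice; the point of the sketch is that the exponential-map parameterization makes the positive-definiteness constraint on the metric implicit, so standard unconstrained gradient steps can be used.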