Variants of unsupervised kernel regression: General cost functions

  • Authors:
  • Stefan Klanke; Helge Ritter

  • Affiliations:
  • Neuroinformatics Group, Faculty of Technology, University of Bielefeld, P.O. Box 10 01 31, 33501 Bielefeld, Germany (both authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2007


Abstract

We present an extension to unsupervised kernel regression (UKR), a recent method for learning nonlinear manifolds, which can utilize leave-one-out cross-validation as an automatic complexity control at no additional computational cost. Our extension allows us to incorporate general cost functions, by which the UKR algorithm can be made more robust or tuned to specific noise models. We focus on Huber's loss and on the ε-insensitive loss, which we present together with a practical optimization approach. We demonstrate our method on both toy and real data.
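The two loss functions named in the abstract are standard robust cost functions; a minimal sketch of their definitions (not the authors' implementation; the parameter values `delta` and `eps` are illustrative):

```python
import numpy as np

def huber_loss(r, delta=1.0):
    # Huber's loss: quadratic for small residuals, linear in the
    # tails, which reduces the influence of outliers.
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r ** 2, delta * (a - 0.5 * delta))

def eps_insensitive_loss(r, eps=0.1):
    # Epsilon-insensitive loss: zero inside the eps-tube around the
    # target, linear outside (as used in support vector regression).
    return np.maximum(0.0, np.abs(r) - eps)
```

Both losses grow only linearly for large residuals, which is what makes a regression fit using them more robust than one using the usual squared error.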