Covariance Matrix Self-Adaptation and Kernel Regression - Perspectives of Evolutionary Optimization in Kernel Machines

  • Authors:
  • Oliver Kramer

  • Affiliations:
  • Technische Universität Dortmund, Department of Computer Science, Algorithm Engineering/Computational Intelligence (LS XI), Otto-Hahn-Str. 14, 44221 Dortmund, Germany. E-mail: oliver.kramer@tu ...

  • Venue:
  • Fundamenta Informaticae - Intelligent Data Analysis in Granular Computing
  • Year:
  • 2010

Abstract

Kernel-based techniques have shown outstanding success in data mining and machine learning in the recent past. Many optimization problems of kernel-based methods suffer from multiple local optima. Evolution strategies have grown into successful methods for non-convex optimization. This work shows how both areas can profit from each other. We investigate the application of evolution strategies to Nadaraya-Watson-based kernel regression and vice versa. The Nadaraya-Watson estimator is used as a meta-model during optimization with the covariance matrix self-adaptation evolution strategy. An experimental analysis evaluates the meta-model assisted optimization process on a set of test functions and investigates model sizes and the balance between objective function evaluations on the real function and on the surrogate. In turn, evolution strategies can be used to solve the embedded optimization problem of unsupervised kernel regression. The latter is fairly parameter dependent, and minimization of the data space reconstruction error is an optimization problem with numerous local optima. We propose an evolution strategy based unsupervised kernel regression method to solve the embedded learning problem. Furthermore, we tune the novel method by means of the parameter tuning technique sequential parameter optimization.
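The following is a minimal, self-contained sketch of the idea behind meta-model assisted search, not the paper's covariance matrix self-adaptation evolution strategy: a Nadaraya-Watson estimator built over an archive of already evaluated points pre-screens new offspring, and only the most promising candidates are evaluated on the real objective. The function names, the Gaussian kernel bandwidth, the sphere test function, and the crude step-size decay are assumptions made purely for illustration.

```python
import numpy as np

def nadaraya_watson(x, X, y, bandwidth=1.0):
    """Nadaraya-Watson estimate at x from archived samples X (n x d) with
    objective values y (n,), using a Gaussian kernel of the given bandwidth."""
    d2 = np.sum((X - x) ** 2, axis=1)            # squared distances to archive points
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))     # Gaussian kernel weights
    return float(np.dot(w, y) / np.sum(w))       # weighted mean of archived objective values

def sphere(x):
    """Toy objective, used only for this illustration."""
    return float(np.dot(x, x))

rng = np.random.default_rng(0)
dim, lam, nu = 5, 20, 5                          # dimension, offspring, real evaluations per generation
mean, sigma = rng.normal(size=dim), 1.0
X_arch = rng.normal(size=(lam, dim))             # initial archive of evaluated points
y_arch = np.array([sphere(x) for x in X_arch])

for gen in range(50):
    offspring = mean + sigma * rng.normal(size=(lam, dim))
    # Pre-screen all offspring on the surrogate; only the nu most promising
    # candidates are evaluated on the real objective function.
    surrogate = np.array([nadaraya_watson(o, X_arch, y_arch) for o in offspring])
    best = offspring[np.argsort(surrogate)[:nu]]
    fitness = np.array([sphere(x) for x in best])
    X_arch = np.vstack([X_arch, best])           # grow the meta-model archive
    y_arch = np.concatenate([y_arch, fitness])
    mean = best[np.argmin(fitness)]              # simple elitist mean update for the sketch
    sigma *= 0.95                                # placeholder decay instead of self-adaptation
```

In this sketch, the balance between real and surrogate evaluations discussed in the abstract corresponds to the ratio nu / lam per generation, and the model size corresponds to the number of archived points used by the estimator.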