The Nadaraya-Watson estimator, also known as kernel regression, is a density-based regression technique: it weights output values by the relative densities in input space, where density is measured with kernel functions that depend on bandwidth parameters. In this work we present an evolutionary bandwidth optimizer for kernel regression. The approach is based on a robust loss function, leave-one-out cross-validation, and the CMSA-ES as its optimization engine. A variant with locally parameterized Nadaraya-Watson models enhances the approach and allows the model to adapt to local characteristics of the data space. The unsupervised counterpart of kernel regression is an approach for learning principal manifolds. The learning problem of unsupervised kernel regression (UKR) consists in optimizing the latent variables, a multimodal problem with many local optima. We propose an evolutionary framework for UKR optimization based on scaling initial locally linear embedding (LLE) solutions and minimizing the cross-validation error. Both methods are analyzed experimentally.
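The core ingredients of the abstract — the Nadaraya-Watson estimator with a Gaussian kernel and the leave-one-out cross-validation error used as the fitness function — can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation; the function names and the Gaussian kernel choice are assumptions, and a plain grid search stands in for the CMSA-ES optimizer.

```python
import numpy as np

def nadaraya_watson(X_train, y_train, X_query, bandwidth):
    """Nadaraya-Watson estimator: each prediction is a kernel-density-
    weighted average of the training outputs (Gaussian kernel assumed)."""
    # Pairwise squared distances between query and training points.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))   # kernel weights
    return (K @ y_train) / K.sum(axis=1)       # weighted average of outputs

def loo_cv_error(X, y, bandwidth):
    """Leave-one-out cross-validation error: predict each training point
    from all the others and average the squared residuals."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / (2.0 * bandwidth ** 2))
    np.fill_diagonal(K, 0.0)                   # exclude each point from its own prediction
    y_hat = (K @ y) / K.sum(axis=1)
    return float(np.mean((y - y_hat) ** 2))
```

In the paper this cross-validation error is minimized over the bandwidth by an evolution strategy; a simple stand-in is `min(bandwidths, key=lambda h: loo_cv_error(X, y, h))` over a candidate set.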