We introduce a new technique for the analysis of kernel-based regression problems. The basic tools are sampling inequalities, which apply to all machine learning problems involving penalty terms induced by kernels related to Sobolev spaces. These inequalities lead to explicit deterministic results on the worst-case behaviour of ε- and ν-support vector regression (SVR). Using them, we show how to adjust the regularization parameters to obtain the best possible approximation orders for regression. The results are illustrated by numerical examples.
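The tradeoff the abstract alludes to — choosing the regularization parameter relative to the sampling density to get the best approximation order — can be sketched with a much simpler regularized kernel method than ε- or ν-SVR, namely kernel ridge regression. This is only an illustrative stand-in for the paper's actual estimators; the exponential kernel `exp(-|x-y|)` is used because it reproduces a Sobolev-type space (H¹ on the line), matching the "kernels related to Sobolev spaces" setting, and the noiseless target mirrors the deterministic worst-case analysis. All function names and parameter values here are our own choices, not from the paper.

```python
import numpy as np

def exponential_kernel(x, y, scale=1.0):
    # k(x, y) = exp(-|x - y| / scale): reproducing kernel of a
    # Sobolev-type space H^1; chosen for illustration only.
    return np.exp(-np.abs(x[:, None] - y[None, :]) / scale)

def kernel_ridge_fit(x_train, y_train, lam):
    # Solve (K + lam * I) alpha = y: the regularized least-squares
    # problem min ||f(x_train) - y||^2 + lam * ||f||_kernel^2.
    K = exponential_kernel(x_train, x_train)
    return np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)

def kernel_ridge_predict(x_train, alpha, x_test):
    return exponential_kernel(x_test, x_train) @ alpha

# Noiseless samples of a smooth target, as in a deterministic
# worst-case analysis.
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x)
x_fine = np.linspace(0.0, 1.0, 200)

for lam in (1e-1, 1e-3, 1e-6):
    alpha = kernel_ridge_fit(x, y, lam)
    sup_err = np.max(np.abs(
        kernel_ridge_predict(x, alpha, x_fine) - np.sin(2 * np.pi * x_fine)))
    print(f"lambda = {lam:.0e}, sup error = {sup_err:.4f}")
```

With noiseless data, shrinking the regularization parameter drives the reconstruction toward kernel interpolation and the sup-norm error down; the paper's sampling inequalities make the analogous statement quantitative for ε- and ν-SVR, prescribing how the parameters should scale with the fill distance of the samples.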