An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
Convex Analysis and Variational Problems
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
Model Selection for Regularized Least-Squares Algorithm in Learning Theory
Foundations of Computational Mathematics
Estimation of Dependences Based on Empirical Data (Springer Series in Statistics)
Method of empirical means in stochastic programming problems
Cybernetics and Systems Analysis
Nonparametric Quantile Estimation
The Journal of Machine Learning Research
Optimal pattern recognition procedures and their application
Cybernetics and Systems Analysis
The paper analyzes the asymptotic properties of Vapnik's SVM estimates of a regression function as the size of the training sample tends to infinity. The estimation problem is treated as an infinite-dimensional minimization of a regularized empirical risk functional in a reproducing kernel Hilbert space. The rate of convergence of the risk functional on SVM estimates to its minimum value is established, and sufficient conditions are given for the uniform convergence of SVM estimates to the true regression function with probability one.