Asymptotic efficiency of kernel support vector machines (SVM)
Cybernetics and Systems Analysis
The paper studies stochastic optimization problems in reproducing kernel Hilbert spaces (RKHS). The objective of such a problem is an expectation functional depending on decision rules (or strategies), i.e., on functions of the observed random parameters, and the feasible rules are restricted to an RKHS. Problems of this kind arise in online decision making and in statistical learning theory. We solve the problem by sample average approximation combined with Tikhonov regularization, and we establish sufficient conditions for uniform convergence of the approximate solutions with probability one, together with a rule for the downward adjustment of the regularization factor as the sample size increases.
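The scheme described above can be illustrated with a minimal numpy sketch for the squared loss: the expectation functional is replaced by its sample average, a Tikhonov penalty lam * ||f||_H^2 is added, and by the representer theorem the regularized empirical minimizer is a finite kernel expansion over the sample. All names, the Gaussian kernel, and the decreasing schedule lam = n^(-1/2) are illustrative assumptions, not the paper's actual conditions on the regularization factor.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def saa_tikhonov_fit(X, y, lam, gamma=1.0):
    # Minimize the sample average of the squared loss plus lam * ||f||_H^2
    # over the RKHS of the Gaussian kernel.  By the representer theorem the
    # minimizer is f(x) = sum_i a_i k(x_i, x) with a = (K + n*lam*I)^{-1} y.
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

rng = np.random.default_rng(0)
f_true = lambda x: np.sin(3 * x[:, 0])   # hypothetical target decision rule

errors = []
for n in (50, 400):
    X = rng.uniform(-1, 1, size=(n, 1))          # observed random parameters
    y = f_true(X) + 0.1 * rng.standard_normal(n)  # noisy observations
    lam = n ** (-0.5)   # downward adjustment of the regularization factor
    a = saa_tikhonov_fit(X, y, lam)
    Xt = np.linspace(-1, 1, 200).reshape(-1, 1)
    pred = gaussian_kernel(Xt, X) @ a
    errors.append(float(np.mean((pred - f_true(Xt)) ** 2)))
```

Driving the regularization factor to zero too fast would let the empirical minimizer overfit the sample, while keeping it fixed would bias the limit; the abstract's convergence result concerns exactly this trade-off between sample size and regularization.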