In this paper, the regression problem in learning theory is investigated via least squares schemes in polynomial spaces. Results concerning the estimation of the rate of convergence are derived. In particular, it is shown that for a smooth regression function of one variable, the estimator achieves a good rate of convergence. The main tool in the study is the Jackson operator from approximation theory, which is used to estimate this rate. Finally, the obtained estimate is illustrated by applying it to simulated data.
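As a rough illustration of the scheme described above, the following minimal sketch fits a least squares estimator in a polynomial space to simulated univariate data and reports its empirical L2 error. The concrete target function, noise level, sample size, and polynomial degree are assumptions chosen for illustration; the paper's own simulation setup is not specified here.

```python
import numpy as np

# Hypothetical setup: the target function, noise level, sample size, and
# polynomial degree below are assumptions, not taken from the paper.
rng = np.random.default_rng(0)

def f(x):
    # A smooth univariate regression function (assumed for illustration).
    return np.sin(2 * np.pi * x)

n = 200                                    # sample size
x = rng.uniform(0.0, 1.0, n)               # design points on [0, 1]
y = f(x) + 0.1 * rng.standard_normal(n)    # noisy observations

degree = 8                                 # polynomial hypothesis space
# Least squares fit over polynomials of degree <= 8.
coeffs = np.polynomial.polynomial.polyfit(x, y, degree)

# Empirical L2 error of the estimator against the true function on a grid.
grid = np.linspace(0.0, 1.0, 1000)
fit = np.polynomial.polynomial.polyval(grid, coeffs)
l2_error = np.sqrt(np.mean((fit - f(grid)) ** 2))
print(f"degree {degree}: empirical L2 error ~ {l2_error:.4f}")
```

Increasing the polynomial degree enlarges the hypothesis space, so in this kind of experiment the error typically trades approximation error against estimation error, which is the balance the paper's convergence-rate analysis quantifies.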