Concentration estimates for the moving least-square method in learning theory
Journal of Approximation Theory
The moving least-squares method is investigated with samples drawn from unbounded sampling processes. Convergence analysis is established by imposing incremental conditions on the moments of the sample output and on the window width. Satisfactory convergence rates are derived by means of a projection operator and concentration inequalities.
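As a rough illustration of the objects named in the abstract, the sketch below states a standard form of the moving least-squares estimator and of the projection (truncation) operator commonly used with unbounded outputs; the hypothesis space \(\mathcal{H}\), window function \(\Phi\), width \(\sigma\), and truncation level \(M\) are generic placeholders, not necessarily the paper's exact choices.

% Illustrative sketch only: a generic moving least-squares estimator with
% output truncation. \mathcal{H}, \Phi, \sigma, and M are assumed
% placeholders, not the paper's exact setting.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Given samples $\{(x_i, y_i)\}_{i=1}^{m}$, a window function $\Phi \ge 0$,
and a window width $\sigma > 0$, the moving least-squares estimate at a
query point $x$ is obtained from a locally weighted least-squares problem:
\[
  f_{\mathbf{z},\sigma}(x) = g_x(x), \qquad
  g_x = \operatorname*{arg\,min}_{g \in \mathcal{H}}
  \frac{1}{m} \sum_{i=1}^{m}
  \Phi\!\left(\frac{x - x_i}{\sigma}\right) \bigl(g(x_i) - y_i\bigr)^2,
\]
where $\mathcal{H}$ is a fixed finite-dimensional hypothesis space
(e.g.\ low-degree polynomials). Since the outputs $y_i$ may be unbounded,
error bounds are typically stated for the truncated estimator
\[
  \pi_M\bigl(f_{\mathbf{z},\sigma}\bigr)(x) =
  \max\bigl(-M, \min\bigl(M, f_{\mathbf{z},\sigma}(x)\bigr)\bigr),
\]
where $\pi_M$ is the projection (truncation) operator at level $M > 0$.
\end{document}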