Classical support vector machine regression (SVMR) is a regularized learning algorithm in a reproducing kernel Hilbert space (RKHS), combining an ε-insensitive loss function with an RKHS-norm regularizer. In this paper, we study a new SVMR algorithm in which the regularization term is instead proportional to the l^1-norm of the coefficients in the kernel ensemble. We provide an error analysis of this algorithm and then derive an explicit learning rate under some assumptions.
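To make the contrast concrete, here is a minimal LaTeX sketch of the two objectives, assuming the usual setup (not fixed by the abstract): a sample {(x_i, y_i)}_{i=1}^m, a kernel K, the sample-dependent hypothesis space of kernel ensembles f_α(x) = Σ_j α_j K(x, x_j), and a regularization parameter λ > 0.

% epsilon-insensitive loss used by both algorithms
|t|_\varepsilon := \max\{|t| - \varepsilon,\, 0\}

% Classical SVMR: RKHS-norm regularizer
f_z = \arg\min_{f \in \mathcal{H}_K}
  \frac{1}{m}\sum_{i=1}^{m} |y_i - f(x_i)|_\varepsilon
  + \lambda \|f\|_K^2

% l^1-regularized variant studied here: penalize the coefficient
% vector of the kernel ensemble instead of the RKHS norm
\alpha_z = \arg\min_{\alpha \in \mathbb{R}^m}
  \frac{1}{m}\sum_{i=1}^{m}
    \Big| y_i - \sum_{j=1}^{m} \alpha_j K(x_i, x_j) \Big|_\varepsilon
  + \lambda \sum_{j=1}^{m} |\alpha_j|,
\qquad
f_z := \sum_{j=1}^{m} (\alpha_z)_j K(\cdot,\, x_j)

The l^1 penalty acts on the coefficient vector α rather than on the function norm, so the resulting scheme is a coefficient-based (linear-programming-style) regularization over a sample-dependent hypothesis space rather than a quadratic RKHS penalty; the exact normalization of the loss and penalty terms is our assumption.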