An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
The covering number in learning theory
Journal of Complexity
Support Vector Machine Soft Margin Classifiers: Error Analysis
The Journal of Machine Learning Research
SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
Neural Computation
Learning Rates of Least-Square Regularized Regression
Foundations of Computational Mathematics
Learning Theory: An Approximation Theory Viewpoint (Cambridge Monographs on Applied and Computational Mathematics)
Multi-kernel regularized classifiers
Journal of Complexity
Classification with Gaussians and Convex Loss
The Journal of Machine Learning Research
On complexity issues of online learning algorithms
IEEE Transactions on Information Theory
Capacity of reproducing kernel spaces in learning theory
IEEE Transactions on Information Theory
This paper continues the study of classification learning algorithms generated by regularization schemes associated with Gaussian kernels and general convex loss functions. Previous work (Xiang and Zhou (2009) [5]; Xiang (2010) [7]) assumed that the convex loss φ has a zero. This assumption excludes some useful loss functions without a zero, such as the logistic loss ℓ(t) = log(1 + exp(−t)). The main purpose of this paper is to conduct an error analysis for the classification learning algorithms associated with such loss functions. The learning rates are derived by a novel application of projection operators, which overcomes this technical difficulty.
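For readers unfamiliar with this line of work, the display below sketches the kind of scheme the abstract refers to. The notation (sample z = {(x_i, y_i)}_{i=1}^m, Gaussian kernel K_σ with associated reproducing kernel Hilbert space H_{K_σ}, regularization parameter λ, and clipping to [−1, 1]) is assumed from the cited papers rather than stated in this abstract.

% A sketch, not a statement from the abstract: the Tikhonov regularization
% scheme and the projection operator in notation standard in the cited
% papers (K_sigma, z, m, lambda are assumptions from that literature).
% Requires amsmath.
\[
  f_{\mathbf{z}} = \operatorname*{arg\,min}_{f \in \mathcal{H}_{K_\sigma}}
  \left\{ \frac{1}{m} \sum_{i=1}^{m} \phi\bigl(y_i f(x_i)\bigr)
  + \lambda \|f\|_{K_\sigma}^2 \right\},
  \qquad \text{e.g. } \phi(t) = \ell(t) = \log\bigl(1 + e^{-t}\bigr),
\]
\[
  \pi(f)(x) =
  \begin{cases}
    1, & \text{if } f(x) > 1,\\
    f(x), & \text{if } -1 \le f(x) \le 1,\\
    -1, & \text{if } f(x) < -1.
  \end{cases}
\]

The reason projection helps, as is standard in this literature, is that sgn(π(f)) = sgn(f), so the misclassification error is unchanged, while π(f_z) is uniformly bounded; this boundedness is what the sample-error estimates need when φ has no zero.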