In this paper, a stochastic gradient descent algorithm is proposed for binary classification problems with general convex loss functions. The algorithm is computationally cheaper than existing algorithms when the sample size is large. Under some reasonable assumptions on the hypothesis space and the underlying distribution, learning rates are established for the algorithm that are faster than those of closely related algorithms.
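The abstract does not spell out the update rule, so the following is only a rough sketch of a generic one-pass stochastic gradient scheme for binary classification with a convex margin loss. The linear hypothesis space, the logistic loss, the decaying step size `eta0 / sqrt(t)`, and the regularization parameter `lam` are all assumptions made for illustration, not the paper's actual choices.

```python
import numpy as np

def sgd_binary_classifier(X, y, loss_grad, eta0=1.0, lam=0.01):
    """One-pass stochastic gradient descent for binary classification.

    X: (n, d) array of samples; y: labels in {-1, +1}.
    loss_grad: derivative phi'(v) of a convex margin loss phi at v = y * <w, x>.
    eta0, lam: illustrative step-size constant and regularization parameter;
    the paper's actual parameter choices are not reproduced here.
    """
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, n + 1):
        x_t, y_t = X[t - 1], y[t - 1]
        eta_t = eta0 / np.sqrt(t)  # assumed polynomially decaying step size
        margin = y_t * np.dot(w, x_t)
        # stochastic gradient of phi(y <w, x>) + (lam / 2) * ||w||^2
        grad = loss_grad(margin) * y_t * x_t + lam * w
        w -= eta_t * grad  # one update per sample: O(d) work per step
    return w

# Logistic loss phi(v) = log(1 + exp(-v)); its derivative in v:
def logistic_grad(v):
    return -1.0 / (1.0 + np.exp(v))

# Toy usage on synthetic, (nearly) linearly separable data:
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
w_true = rng.normal(size=5)
y = np.sign(X @ w_true)
w_hat = sgd_binary_classifier(X, y, logistic_grad)
```

Because each update touches a single sample, a full pass costs O(nd), which is the kind of computational advantage over batch methods for large sample sizes that the abstract alludes to.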