This paper establishes error bounds for coefficient-based regularized regression schemes associated with a Lipschitz loss. Our main goal is to derive convergence rates for this algorithm by means of nonsmooth analysis. Using the generalized gradients of the loss, we give an explicit expression for the solution, which yields a capacity-independent bound on the sample error. An estimate of the approximation error is also provided using probability-theoretic tools.
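For concreteness, a minimal sketch of the kind of coefficient-based regularization scheme the abstract refers to is given below; the kernel expansion, the penalty $\Omega$, and all notation are illustrative assumptions, not details taken from the paper. Given a sample $z=\{(x_i,y_i)\}_{i=1}^m$ and a Mercer kernel $K$, the estimator is sought in the sample-dependent hypothesis space $\mathcal{H}_{K,z}=\{f_\alpha(\cdot)=\sum_{i=1}^m \alpha_i K(\cdot,x_i):\alpha\in\mathbb{R}^m\}$:

% Illustrative form of a coefficient-based regularized scheme (assumed, not from the paper):
\[
  f_z = f_{\alpha_z},
  \qquad
  \alpha_z \in \operatorname*{arg\,min}_{\alpha \in \mathbb{R}^m}
  \left\{ \frac{1}{m} \sum_{i=1}^m L\bigl(y_i, f_\alpha(x_i)\bigr)
          + \lambda\, \Omega(\alpha) \right\},
\]

where $L(y,t)$ is Lipschitz in its second argument (e.g. the hinge, $\varepsilon$-insensitive, or pinball loss) and $\Omega$ is a penalty on the coefficient vector, such as $\|\alpha\|_1$ or $\|\alpha\|_2^2$. Since such losses need not be differentiable, optimality conditions for $\alpha_z$ are naturally expressed through Clarke's generalized gradient $\partial L$ rather than the ordinary derivative, which is what allows an explicit expression of the solution in terms of generalized gradients.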