We investigate robustness properties of a broad class of support vector machines with non-smooth loss functions. These kernel methods arise from convex risk minimization in infinite-dimensional Hilbert spaces. Leading examples are support vector regression based on the ε-insensitive loss function and kernel-based quantile regression based on the pinball loss function. First, we propose the Bouligand influence function (BIF), a modification of F.R. Hampel's influence function. The BIF has the advantage of being positively homogeneous, which is in general not true for Hampel's influence function. Second, we show that many support vector machines based on a Lipschitz continuous loss function and a bounded kernel have a bounded BIF and are therefore robust in the sense of influence-function-based robust statistics.
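As a minimal sketch (not from the paper itself), the two non-smooth loss functions named in the abstract can be written as follows; both are Lipschitz continuous with constant 1, which is the key property behind the bounded-BIF result. Function names and default parameter values are illustrative choices, not the paper's notation.

```python
def eps_insensitive(residual: float, eps: float = 0.1) -> float:
    """ε-insensitive loss used in support vector regression:
    max(0, |r| - eps); zero inside the ε-tube around the prediction."""
    return max(0.0, abs(residual) - eps)

def pinball(residual: float, tau: float = 0.5) -> float:
    """Pinball loss for quantile level tau in (0, 1), used in
    kernel-based quantile regression: tau*r if r >= 0, else (tau-1)*r."""
    return tau * residual if residual >= 0 else (tau - 1.0) * residual
```

Neither loss is differentiable everywhere (the ε-insensitive loss has kinks at ±ε, the pinball loss at 0), which is why the paper works with Bouligand derivatives rather than classical ones.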