The paper brings together methods from two disciplines: machine learning theory and robust statistics. We argue that robustness is an important property and show that many existing machine learning methods based on the convex risk minimization principle have, besides other desirable properties, the advantage of being robust. Robustness properties of such methods are investigated for the problem of pattern recognition. Assumptions are given under which the influence function of the resulting classifiers exists and is bounded. Kernel logistic regression, support vector machines, least squares, and the AdaBoost loss function are treated as special cases. Robustness results are also obtained for two further criteria, the sensitivity curve and the maxbias, and a sensitivity analysis of the support vector machine is given.
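For orientation, the robustness notions the abstract refers to can be written out; the definitions below are the standard ones from robust statistics (Hampel's influence function, Tukey's sensitivity curve, and the maxbias), stated here as a sketch with T(P) denoting the classifier obtained by convex risk minimization under distribution P and \delta_z the point mass at a contaminating point z:

\[
\mathrm{IF}(z; T, P) \;=\; \lim_{\varepsilon \downarrow 0} \frac{T\bigl((1-\varepsilon)P + \varepsilon\,\delta_z\bigr) - T(P)}{\varepsilon},
\]
\[
\mathrm{SC}_n(z) \;=\; n \,\Bigl( T_n(z_1, \dots, z_{n-1}, z) - T_{n-1}(z_1, \dots, z_{n-1}) \Bigr),
\]
\[
\mathrm{maxbias}(\varepsilon; T, P) \;=\; \sup \Bigl\{ \bigl\| T(Q) - T(P) \bigr\| \;:\; Q = (1-\varepsilon)P + \varepsilon \tilde{P} \Bigr\}.
\]

A bounded influence function means that an infinitesimal contamination at any point z has a uniformly bounded effect on the classifier, which is the formal sense in which the methods studied here are robust.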
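A finite-sample version of the sensitivity curve is easy to probe empirically. The following is a minimal, hypothetical sketch (not from the paper) that approximates the sensitivity curve of a kernel support vector machine by retraining with one added contaminating point; it assumes scikit-learn and NumPy, and the toy data, kernel parameters, and probe point z are illustrative choices.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Two-class toy sample of size n: Gaussian blobs around (-1, 0) and (+1, 0).
n = 200
X = np.vstack([rng.normal(-1.0, 1.0, size=(n // 2, 2)),
               rng.normal(+1.0, 1.0, size=(n // 2, 2))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

def decision(X_train, y_train, X_eval):
    """Fit a kernel SVM and return its decision function on X_eval."""
    clf = SVC(kernel="rbf", C=1.0, gamma=0.5)  # illustrative parameters
    clf.fit(X_train, y_train)
    return clf.decision_function(X_eval)

# Empirical sensitivity curve at a contaminating point z with label +1:
# (n+1) * (T_{n+1}(z_1, ..., z_n, z) - T_n(z_1, ..., z_n)),
# evaluated on a grid of test points.
z = np.array([[5.0, 5.0]])                  # far-away outlier as contamination
X_eval = rng.normal(0.0, 1.0, size=(50, 2)) # test points for the comparison

f_clean = decision(X, y, X_eval)
f_contam = decision(np.vstack([X, z]), np.hstack([y, [1.0]]), X_eval)
sens_curve = (n + 1) * (f_contam - f_clean)

print("max |sensitivity| over test points:", np.max(np.abs(sens_curve)))

Under the paper's assumptions on the loss and the kernel, this quantity should remain bounded even as z is moved arbitrarily far from the data, mirroring the bounded influence function established in the theory.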