Kernel Based Regression (KBR) minimizes a convex risk over a possibly infinite-dimensional reproducing kernel Hilbert space. Recently it was shown that KBR with a least squares loss function can have undesirable robustness properties: even very small amounts of outliers can dramatically affect the estimates. KBR with other loss functions is more robust, but often leads to more involved computations (e.g. for the Huber or logistic loss). In classical statistics, robustness is often improved by reweighting the original estimate. In this paper we provide a theoretical framework for reweighted Least Squares KBR (LS-KBR) and analyze its robustness. Some important differences are found with respect to linear regression, indicating that LS-KBR with a bounded kernel is much better suited for reweighting. In two special cases our results translate into practical guidelines for a good choice of weights, providing robustness as well as fast convergence. In particular, a logistic weight function appears to be an appropriate choice, not only to downweight outliers, but also to improve performance under heavy-tailed distributions. For the latter, some heuristic arguments are given comparing concepts from robustness and stability.
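The reweighting idea described above can be sketched in code. The following is a minimal illustration, not the paper's exact procedure: it assumes a Gaussian (bounded) kernel, a logistic weight function w(u) = tanh(u/2)/(u/2) applied to scaled residuals, and a MAD-based robust scale estimate. All function names and parameter values here are illustrative choices, not taken from the source.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix; a bounded kernel, in line with the
    # abstract's observation that bounded kernels suit reweighting.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def reweighted_ls_kbr(X, y, lam=0.1, sigma=1.0, n_iter=10):
    """Illustrative iteratively reweighted LS-KBR.

    Starts from the unweighted least squares fit (kernel ridge
    regression) and repeatedly downweights large residuals using the
    logistic weight function w(u) = tanh(u/2) / (u/2).
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    # initial unweighted LS-KBR estimate
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    for _ in range(n_iter):
        r = y - K @ alpha                           # residuals
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
        u = r / s
        # logistic weights; w -> 1 as u -> 0
        w = np.where(np.abs(u) < 1e-8, 1.0,
                     np.tanh(u / 2.0) / (u / 2.0 + 1e-12))
        W = np.diag(w)
        # weighted regularized least squares step:
        # minimizes (y - K a)^T W (y - K a) + lam * n * a^T K a
        alpha = np.linalg.solve(W @ K + lam * n * np.eye(n), W @ y)
    return alpha, K

# usage: fit a noisy sine curve containing one gross outlier
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 60)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
y[30] = 10.0                                        # inject an outlier
alpha, K = reweighted_ls_kbr(X, y)
fit = K @ alpha                                     # robust fitted values
```

Because the logistic weights decay for large scaled residuals, the injected outlier contributes almost nothing to the final fit, whereas a single unweighted least squares pass would be pulled toward it.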