Support vector machines (SVMs) are kernel-based methods that have been among the most successful learning methods for more than a decade. Informally, SVMs can be described as regularized M-estimators for functions, and they have demonstrated their usefulness in many complex real-life problems. During the last few years, a large part of the statistical research on SVMs has concentrated on the question of how to design SVMs so that they are universally consistent and statistically robust for nonparametric classification or nonparametric regression. In many applications, some qualitative prior knowledge of the distribution P or of the unknown function f to be estimated is available, or a prediction function with good interpretability is desired, so that a semiparametric model or an additive model is of interest. The question of how to design SVMs by choosing the reproducing kernel Hilbert space (RKHS), or its corresponding kernel, so as to obtain consistent and statistically robust estimators in additive models is addressed. An explicit construction of such RKHSs and their kernels, which will be called additive kernels, is given; SVMs based on additive kernels will be called additive support vector machines. In combination with a Lipschitz continuous loss function, the use of such additive kernels leads to SVMs with the desired properties for additive models. Examples include quantile regression based on the pinball loss function, regression based on the ε-insensitive loss function, and classification based on the hinge loss function.
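To make the idea of an additive kernel concrete, the following sketch builds a kernel of the form k(x, y) = Σ_j k_j(x_j, y_j), i.e. a sum of univariate kernels over the input coordinates, and plugs it into a hinge-loss SVM classifier. This is an illustrative toy example, not the paper's construction: the choice of univariate Gaussian components, the `gamma` value, and the synthetic additive data set are all assumptions made for the sketch.

```python
import numpy as np
from sklearn.svm import SVC


def additive_rbf_kernel(X, Y, gamma=1.0):
    """Additive kernel: k(x, y) = sum_j exp(-gamma * (x_j - y_j)**2).

    Each summand is a valid univariate kernel on one coordinate, so the
    sum is again a positive definite kernel, with an RKHS consisting of
    sums of univariate functions (an additive model)."""
    K = np.zeros((X.shape[0], Y.shape[0]))
    for j in range(X.shape[1]):
        diff = X[:, j][:, None] - Y[None, :][0][:, j]
        K += np.exp(-gamma * diff ** 2)
    return K


# Toy data whose decision function is truly additive:
# label = 1 iff sin(x_1) + x_2**2 exceeds a threshold.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(200, 2))
y = (np.sin(X[:, 0]) + X[:, 1] ** 2 > 1.0).astype(int)

# Hinge-loss SVM; scikit-learn's SVC accepts a callable kernel.
clf = SVC(kernel=additive_rbf_kernel)
clf.fit(X, y)
```

Because the hypothesis space induced by this kernel contains only additive functions, the fitted decision function inherits the interpretability of an additive model, while Lipschitz continuity of the hinge loss is what the abstract identifies as the other ingredient for consistency and robustness.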