The Nature of Statistical Learning Theory. Springer.
Estimation of Dependences Based on Empirical Data. Springer Series in Statistics.
Bounds on Error Expectation for Support Vector Machines. Neural Computation.
An Introduction to Kernel-Based Learning Algorithms. IEEE Transactions on Neural Networks.
An Estimation of the Optimal Gaussian Kernel Parameter for Support Vector Classification. ISNN '08: Proceedings of the 5th International Symposium on Neural Networks: Advances in Neural Networks.
The Knowledge Engineering Review.
We propose an algorithm to predict the leave-one-out (LOO) error of kernel-based classifiers. To achieve this goal with computational efficiency, we cast the LOO error approximation task itself as a classification problem: we learn to classify whether or not a given training sample, if left out of the data set, would be misclassified. For this learning task, we propose simple data-dependent features inspired by geometric intuition. Our approach enables reliable model selection, as demonstrated in simulations on Support Vector and Linear Programming Machines. Comparisons to existing learning-theoretical bounds, e.g. the span bound, are given for various model selection scenarios.
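The core idea of the abstract can be sketched on toy data. The following is a minimal illustration, not the paper's actual method: a k-NN rule stands in for the kernel classifier, the two geometric features (distance to the nearest same-class and nearest other-class neighbour) are our own hypothetical choices, and a small logistic regression plays the role of the meta-classifier that predicts whether each point would be misclassified under LOO.

```python
import numpy as np

# Hedged sketch of the idea: learn a binary meta-classifier that predicts,
# from simple geometric features, whether a training point would be
# misclassified if left out. Features and base learner are illustrative
# assumptions, not the paper's actual construction.

rng = np.random.default_rng(0)

# Toy two-class Gaussian data.
X = np.vstack([rng.normal(-1.0, 1.0, (40, 2)), rng.normal(+1.0, 1.0, (40, 2))])
y = np.array([-1] * 40 + [+1] * 40)
n = len(y)

def knn_predict(X_train, y_train, x, k=3):
    """Plain k-NN stand-in for a kernel classifier (odd k avoids ties)."""
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    return np.sign(y_train[idx].sum())

# Ground-truth LOO labels by brute force (feasible only on toy data;
# avoiding exactly this loop is the point of the proposed approach).
loo_error = np.array([
    knn_predict(np.delete(X, i, 0), np.delete(y, i), X[i]) != y[i]
    for i in range(n)
], dtype=float)

# Simple data-dependent geometric features per point:
#   f1 = distance to nearest same-class neighbour
#   f2 = distance to nearest other-class neighbour
feats = []
for i in range(n):
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf  # exclude the point itself
    feats.append([d[y == y[i]].min(), d[y != y[i]].min()])
F = np.array(feats)

# Tiny logistic regression via gradient descent as the meta-classifier.
Fb = np.hstack([F, np.ones((n, 1))])  # add a bias column
W = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Fb @ W))
    W -= 0.1 * Fb.T @ (p - loo_error) / n

pred = (1.0 / (1.0 + np.exp(-Fb @ W))) > 0.5
print("true LOO error rate:     ", loo_error.mean())
print("predicted LOO error rate:", pred.mean())
```

The meta-classifier's predicted per-point LOO errors can then be averaged to score a candidate model without rerunning the full LOO loop, which is where the computational savings would come from.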