An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
Learning with the Set Covering Machine
ICML '01 Proceedings of the Eighteenth International Conference on Machine Learning
Similar Classifiers and VC Error Bounds
A Compression Approach to Support Vector Model Selection
The Journal of Machine Learning Research
Tutorial on Practical Prediction Theory for Classification
The Journal of Machine Learning Research
Learning with Decision Lists of Data-Dependent Features
The Journal of Machine Learning Research
Validation of network classifiers
SSPR'12/SPR'12 Proceedings of the 2012 Joint IAPR international conference on Structural, Syntactic, and Statistical Pattern Recognition
This paper develops bounds on out-of-sample error rates for support vector machines (SVMs). The bounds are based on the numbers of support vectors in the SVMs rather than on VC dimension. The bounds developed here improve on support vector counting bounds derived using Littlestone and Warmuth's compression-based bounding technique.
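To make the baseline concrete, the classic Littlestone–Warmuth-style sample compression bound that the abstract says is improved upon can be sketched as follows. This is a hedged illustration of the standard bound, not the paper's improved bound: it assumes a compression scheme of size d (e.g. the d support vectors of an SVM trained on n examples) whose reconstructed classifier is consistent with the remaining n - d training examples.

```python
from math import comb, log

def compression_bound(n, d, delta=0.05):
    """Classic sample-compression bound (Littlestone-Warmuth style).

    For a classifier reconstructed from d of n training examples
    (e.g. the support vectors of an SVM) that correctly classifies
    the remaining n - d examples, with probability at least 1 - delta
    the out-of-sample error rate is at most:

        (ln C(n, d) + ln(1 / delta)) / (n - d)

    The ln C(n, d) term is the union bound over all possible
    compression sets of size d; the improved bounds in the paper
    tighten this dependence on the support vector count.
    """
    assert 0 < d < n, "compression set must be a proper nonempty subset"
    return (log(comb(n, d)) + log(1.0 / delta)) / (n - d)

# Example: 1000 training points, an SVM with 50 support vectors.
print(compression_bound(1000, 50, delta=0.05))
```

Note how the bound degrades as the number of support vectors d grows: fewer support vectors (a smaller compression set) yield a tighter guarantee, which is the intuition behind bounding SVM error by support vector counts rather than VC dimension.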