The generalization properties of support vector machines (SVMs) are examined. From a geometrical point of view, the parameter estimated by an SVM is the point nearest the origin in the convex hull formed by the given examples. Since introducing soft margins is equivalent to shrinking this convex hull, an SVM with soft margins has a learning curve different from that of the hard-margin SVM. In this paper we derive the asymptotic average generalization error of SVMs with soft margins in the simplest case, namely when the input dimension is one, and show quantitatively that soft margins increase the generalization error.
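As a rough empirical illustration of the setting the abstract describes, the following sketch trains a linear SVM on a one-dimensional toy problem at two values of the soft-margin parameter C (a large C approximating the hard-margin machine, a small C giving a soft margin) and measures the test error of each. The dataset, noise level, and choice of C values are illustrative assumptions, not taken from the paper; the paper's result concerns the asymptotic average error, which a single finite-sample run can only hint at.

```python
# Illustrative sketch (not the paper's derivation): compare a
# hard-margin-like SVM (very large C) with a soft-margin SVM (small C)
# on noisy 1-D data and report each test error.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_train, n_test = 50, 1000

# 1-D inputs; true labels are the sign of x, with label noise on the
# training set so that the soft margin actually comes into play.
X_train = rng.uniform(-1.0, 1.0, size=(n_train, 1))
y_train = np.sign(X_train[:, 0] + rng.normal(0.0, 0.1, size=n_train))
X_test = rng.uniform(-1.0, 1.0, size=(n_test, 1))
y_test = np.sign(X_test[:, 0])

errors = {}
for C in (1e6, 0.1):  # large C ~ hard margin; small C = soft margin
    clf = SVC(kernel="linear", C=C).fit(X_train, y_train)
    errors[C] = 1.0 - clf.score(X_test, y_test)
    print(f"C = {C:g}: test error = {errors[C]:.3f}")
```

Averaging such errors over many randomly drawn training sets, for each C, would approximate the average generalization error whose asymptotics the paper computes analytically.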