This work describes the application of the Maximal Discrepancy (MD) criterion to hyperparameter setting in SVMs and points out the advantages of this approach over existing theoretical and practical frameworks. The resulting theoretical predictions are compared with an empirical k-fold cross-validation method on several benchmark datasets, showing that the MD technique can be used for automatic SVM model selection.
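The contrast drawn above can be illustrated in code. Below is a minimal sketch, not the paper's exact procedure, of MD-based model selection next to a k-fold cross-validation baseline: the synthetic dataset, the RBF-kernel `SVC` from scikit-learn, the hyperparameter grid, and the helper name `md_score` are all illustrative assumptions. The MD penalty is estimated in the standard way, by flipping the labels of a random half of the training set and retraining: the better the hypothesis class fits the flipped set, the larger the discrepancy between its error rates on the two halves, signalling a risk of overfitting.

```python
# Illustrative sketch of Maximal Discrepancy (MD) model selection for an SVM,
# compared with k-fold cross-validation. Dataset, grid and helper names are
# assumptions for demonstration, not taken from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y01 = make_classification(n_samples=200, n_features=10, random_state=0)
y = 2 * y01 - 1  # the label-flipping step below assumes labels in {-1, +1}

def md_score(X, y, C, gamma):
    """Training error plus an estimated Maximal Discrepancy penalty."""
    n = len(y)
    train_err = np.mean(SVC(C=C, gamma=gamma).fit(X, y).predict(X) != y)

    # Flip the labels of a random half and retrain on the corrupted set.
    half = rng.permutation(n)[: n // 2]
    other = np.setdiff1d(np.arange(n), half)
    y_flip = y.copy()
    y_flip[half] *= -1
    pred = SVC(C=C, gamma=gamma).fit(X, y_flip).predict(X)
    # Discrepancy between error rates on the two halves (w.r.t. true labels).
    md_penalty = abs(np.mean(pred[half] != y[half])
                     - np.mean(pred[other] != y[other]))
    return train_err + md_penalty

grid = [(C, g) for C in (0.1, 1.0, 10.0) for g in (0.01, 0.1, 1.0)]
best_md = min(grid, key=lambda p: md_score(X, y, *p))
best_cv = max(grid, key=lambda p: cross_val_score(
    SVC(C=p[0], gamma=p[1]), X, y, cv=5).mean())
print("MD choice (C, gamma):", best_md)
print("CV choice (C, gamma):", best_cv)
```

One practical appeal of the MD criterion is visible here: each candidate hyperparameter pair costs only one extra training run (on the label-flipped set), whereas k-fold cross-validation costs k training runs per candidate.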