Maximal Discrepancy for Support Vector Machines
Neurocomputing
In this paper, we target the problem of model selection for Support Vector Classifiers through in-sample methods, which are particularly appealing in the small-sample regime, i.e. when few high-dimensional patterns are available. In particular, we describe the application of a trimmed hinge loss function to Rademacher Complexity and Maximal Discrepancy based in-sample approaches. We also show that the selected classifiers outperform those obtained with other state-of-the-art in-sample and out-of-sample model selection techniques in classifying human gene expression datasets.
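The sketch below illustrates the general idea of Maximal Discrepancy based in-sample model selection with a trimmed (ramp) hinge loss, min(1, max(0, 1 - yf(x))), which is bounded in [0, 1]. It is only an illustrative approximation, not the authors' exact procedure: a standard convex SVM solver is used as a stand-in for minimizing the non-convex trimmed loss, the discrepancy-maximizing classifier is found heuristically by retraining on a sample with half of the labels flipped, and the function names, hyperparameter grid, and dataset are all assumptions made for this example.

```python
# Illustrative sketch (assumptions noted above): pick SVM hyperparameters by
# minimizing an in-sample estimate = empirical trimmed-hinge error + a
# Maximal Discrepancy style penalty, without a separate validation set.
import numpy as np
from sklearn.svm import SVC


def ramp_loss(margins):
    """Trimmed hinge (ramp) loss: min(1, max(0, 1 - y*f(x))), bounded in [0, 1]."""
    return np.clip(1.0 - margins, 0.0, 1.0)


def md_penalized_error(X, y, C, gamma, seed=0):
    """Empirical ramp-loss error plus an approximate maximal-discrepancy penalty.

    The penalty is a lower bound on the maximal discrepancy: we heuristically
    look for a classifier with large discrepancy by retraining on a copy of the
    sample in which the labels of the first half are flipped, then measure the
    difference of trimmed-hinge errors between the two halves (original labels).
    Labels y are assumed to be in {-1, +1}.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    half = len(y) // 2

    # Empirical trimmed-hinge error of the model trained on the full sample.
    clf = SVC(C=C, gamma=gamma).fit(X, y)
    emp = ramp_loss(y * clf.decision_function(X)).mean()

    # Approximate maximal-discrepancy penalty via label flipping on one half.
    y_flip = y.copy()
    y_flip[idx[:half]] *= -1
    clf_md = SVC(C=C, gamma=gamma).fit(X, y_flip)
    margins = y * clf_md.decision_function(X)
    penalty = ramp_loss(margins[idx[:half]]).mean() - ramp_loss(margins[idx[half:]]).mean()

    return emp + max(penalty, 0.0)


# Usage (hypothetical small, high-dimensional binary dataset X, y with labels in {-1, +1}):
# best_C, best_gamma = min(
#     ((C, g) for C in (0.1, 1.0, 10.0) for g in (0.01, 0.1)),
#     key=lambda p: md_penalized_error(X, y, *p),
# )
```

Because the penalty is computed from the training data itself, this kind of estimate can drive model selection when too few patterns are available to spare a hold-out or cross-validation split.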