Fundamentals of matrix computations
A training algorithm for optimal margin classifiers
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Numerical recipes in C (2nd ed.): the art of scientific computing
The nature of statistical learning theory
Machine Learning
Introduction to support vector learning
Advances in kernel methods
Making large-scale support vector machine learning practical
Advances in kernel methods
Pairwise classification and support vector machines
Advances in kernel methods
The bias-variance tradeoff and the randomized GACV
Proceedings of the 1998 conference on Advances in neural information processing systems II
Machine Learning
Adaptive Regularization in Neural Network Modeling
Neural Networks: Tricks of the Trade
KMOD: A Two-Parameter SVM Kernel for Pattern Recognition
ICPR '02 Proceedings of the 16th International Conference on Pattern Recognition (ICPR'02), Volume 3
ICDAR '01 Proceedings of the Sixth International Conference on Document Analysis and Recognition
Gradient-Based Optimization of Hyperparameters
Neural Computation
An overview of statistical learning theory
IEEE Transactions on Neural Networks
A trainable feature extractor for handwritten digit recognition
Pattern Recognition
Fast and efficient strategies for model selection of Gaussian support vector machine
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Automatic model selection for the optimization of SVM kernels
Pattern Recognition
ACIVS'07 Proceedings of the 9th international conference on Advanced concepts for intelligent vision systems
Extended Bayesian framework for automatic tuning of kernel data-mining methods
ACS'06 Proceedings of the 6th WSEAS international conference on Applied computer science
The use of stability principle for kernel determination in relevance vector machines
ICONIP'06 Proceedings of the 13th International Conference on Neural Information Processing, Volume Part I
We address the problem of optimizing kernel parameters in Support Vector Machine modelling, especially when the number of parameters is greater than one, as in polynomial kernels and in KMOD, our newly introduced kernel. The present work is an extended experimental study of the framework proposed by Chapelle et al. for optimizing SVM kernels using an analytic upper bound on the error. Our optimization scheme, however, minimizes an empirical error estimate using a Quasi-Newton technique, and has been shown to reduce the number of support vectors over the course of the optimization. To assess our contribution, the approach is further used to adapt the KMOD, RBF and polynomial kernels on synthetic data and on the NIST digit image database. The method yields satisfactory results with much faster convergence than the simple gradient descent method. Furthermore, we also experimented with two additional optimization schemes, based respectively on maximizing the margin and on minimizing an approximated VC-dimension estimate. Although both of these objective functions decrease during optimization, the error does not; the corresponding experimental results illustrate this shortcoming.
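The core idea of the scheme, minimizing an empirical error estimate over the kernel parameters, can be sketched in a few lines of numpy. This is a hedged illustration, not the paper's method: it substitutes a kernel ridge classifier for the SVM, a coarse log-grid search for the Quasi-Newton step, and synthetic Gaussian blobs for the experimental datasets; the RBF bandwidth `gamma` plays the role of the kernel parameter being tuned.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def validation_error(gamma, Xtr, ytr, Xva, yva, lam=1e-2):
    """Empirical error of a kernel ridge classifier (labels in {-1, +1})."""
    Ktr = rbf_kernel(Xtr, Xtr, gamma)
    alpha = np.linalg.solve(Ktr + lam * np.eye(len(Xtr)), ytr.astype(float))
    pred = np.sign(rbf_kernel(Xva, Xtr, gamma) @ alpha)
    return float(np.mean(pred != yva))

# Synthetic two-class data (a stand-in for the paper's datasets).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
               rng.normal(2.5, 1.0, (100, 2))])
y = np.array([-1] * 100 + [1] * 100)
idx = rng.permutation(200)
Xtr, ytr = X[idx[:100]], y[idx[:100]]
Xva, yva = X[idx[100:]], y[idx[100:]]

# Coarse log-grid search over gamma: a derivative-free stand-in for the
# Quasi-Newton minimization of the empirical error estimate.
gammas = np.logspace(-3, 2, 20)
errs = [validation_error(g, Xtr, ytr, Xva, yva) for g in gammas]
best_gamma = float(gammas[int(np.argmin(errs))])
```

With more than one kernel parameter (as in polynomial kernels or KMOD), the same loop extends to a multi-dimensional search, which is where a gradient-based Quasi-Newton method becomes clearly preferable to an exhaustive grid.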