The Support Vector Machine (SVM) algorithm is sensitive to the choice of parameter settings, which makes it difficult for non-experts to use. It has been shown that meta-learning can support the selection of SVM parameter values. Previous approaches have used general statistical measures as meta-features. Here we propose a new set of meta-features based on the kernel matrix. We test them on the problem of setting the width of the Gaussian kernel for regression problems, and we obtain significant improvements over earlier meta-learning results. We expect that, with better support for the selection of parameter values, SVMs will become accessible to a wider range of users.
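As an illustration of the kind of kernel-matrix-based meta-features described above, the sketch below computes a Gaussian (RBF) kernel matrix for several candidate widths and extracts two simple statistics from it. The choice of statistics (mean and standard deviation of the off-diagonal entries) is an assumption made here for illustration; it is not necessarily the meta-feature set proposed in the paper.

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma):
    """Gaussian kernel K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))."""
    sq = np.sum(X**2, axis=1)
    # Pairwise squared Euclidean distances via the expansion
    # ||x_i - x_j||^2 = ||x_i||^2 + ||x_j||^2 - 2 <x_i, x_j>
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))

def kernel_meta_features(K):
    """Illustrative meta-features: statistics of the off-diagonal entries.

    For a very small width the off-diagonal entries collapse toward 0
    (every point is 'far' from every other); for a very large width they
    approach 1 (every point looks similar). Statistics of K thus carry
    information about whether a candidate width is reasonable.
    """
    off = K[~np.eye(K.shape[0], dtype=bool)]
    return {
        "mean_offdiag": float(off.mean()),
        "std_offdiag": float(off.std()),
    }

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
for sigma in (0.1, 1.0, 10.0):
    feats = kernel_meta_features(gaussian_kernel_matrix(X, sigma))
    print(f"sigma={sigma}: {feats}")
```

In a meta-learning setting, such statistics would be computed per dataset and fed to a meta-model that recommends a kernel width; the snippet only shows the feature-extraction step.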