The key challenge in kernel-based learning algorithms is the choice of an appropriate kernel and its optimal parameters. Selecting the optimal degree of a polynomial kernel is critical to ensuring good generalisation of the resulting support vector machine model. In this paper we propose Bayesian and Laplace approximation methods to estimate the polynomial degree. A rule-based meta-learning approach is then proposed for automatic selection of the polynomial kernel and its optimal degree. The new approach is constructed and tested on 112 datasets of varying sizes, covering both binary-class and multi-class classification problems. An extensive computational evaluation of these methods is conducted, and rules are generated to determine when each approximation method is appropriate.
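The paper's Bayesian and Laplace approximation estimators are not reproduced here; as a minimal, self-contained illustration of why the polynomial degree matters, the sketch below trains a dual (kernel) perceptron with a polynomial kernel K(x, z) = (x·z + c)^d on a toy one-dimensional problem and selects the degree by training error. The dataset, the perceptron learner, and all names are illustrative assumptions, not the authors' method or data.

```python
# Illustrative sketch only: degree selection for a polynomial kernel via a
# dual perceptron and training error. This is NOT the paper's Bayesian /
# Laplace approach; the toy data and all names are invented for the example.

def poly_kernel(x, z, degree, coef0=1.0):
    """Polynomial kernel K(x, z) = (x . z + coef0) ** degree."""
    return (sum(a * b for a, b in zip(x, z)) + coef0) ** degree

def train_kernel_perceptron(X, y, degree, epochs=20):
    """Dual perceptron: alpha[i] counts the mistakes made on example i."""
    alpha = [0] * len(X)
    for _ in range(epochs):
        for i, (xi, yi) in enumerate(zip(X, y)):
            s = sum(alpha[j] * y[j] * poly_kernel(X[j], xi, degree)
                    for j in range(len(X)))
            if yi * s <= 0:  # mistake (or on the boundary): update
                alpha[i] += 1
    return alpha

def predict(X, y, alpha, x, degree):
    s = sum(alpha[j] * y[j] * poly_kernel(X[j], x, degree)
            for j in range(len(X)))
    return 1 if s > 0 else -1

# Toy 1-D data: positives at both ends, so no degree-1 (linear) kernel can
# separate it, but degree 2 can (the feature map contains x**2).
X_train = [(-2,), (-1,), (0,), (1,), (2,)]
y_train = [1, -1, -1, -1, 1]

best_degree, best_err = None, float("inf")
for degree in (1, 2, 3, 4):
    alpha = train_kernel_perceptron(X_train, y_train, degree)
    err = sum(predict(X_train, y_train, alpha, x, degree) != yi
              for x, yi in zip(X_train, y_train))
    if err < best_err:  # strict <: the smallest adequate degree wins ties
        best_degree, best_err = degree, err

print(best_degree, best_err)  # → 2 0
```

Minimising training error, as above, tends to overfit toward higher degrees on realistic data; this is exactly the motivation for the marginal-likelihood (Bayesian and Laplace) estimates the paper proposes, which penalise unnecessary model complexity.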