MP-polynomial kernel for training support vector machines
CIARP'07 Proceedings of the Congress on pattern recognition 12th Iberoamerican conference on Progress in pattern recognition, image analysis and applications
We discuss how explicit algebraic expressions that model a complex phenomenon from an adequate data set can be derived, on the one hand, from Genetic Multivariate Polynomials (GMPs) and, on the other, from Support Vector Machines (SVMs). A polynomial expression arises naturally in GMPs, whereas in SVMs a polynomial kernel is employed to derive a comparable one. For any particular problem, a GMP expression requires an evolutionarily determined sample of monomials, while the SVM approach handles a large number of monomials implicitly through the kernel. We report experiments comparing the modeling characterization and accuracy obtained with both methods.
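The remark that "a large number of monomials is implicit in the SVM approach" can be made concrete: by the multinomial theorem, the polynomial kernel (x·y + c)^d equals the inner product of explicit feature vectors whose coordinates are all monomials of total degree up to d. The sketch below (function names are ours, not from the paper; it assumes the standard inhomogeneous polynomial kernel) enumerates those monomials and verifies the identity numerically:

```python
import math
from itertools import combinations_with_replacement

def poly_kernel(x, y, degree=2, c=1.0):
    """Inhomogeneous polynomial kernel (x . y + c) ** degree."""
    return (sum(a * b for a, b in zip(x, y)) + c) ** degree

def phi(x, degree=2, c=1.0):
    """Explicit monomial feature map for poly_kernel.

    Appending sqrt(c) to x lets the constant term be treated as one more
    coordinate; each feature is a degree-`degree` monomial scaled by the
    square root of its multinomial coefficient, so that
    dot(phi(x), phi(y)) == poly_kernel(x, y).
    """
    z = list(x) + [math.sqrt(c)]
    feats = []
    for combo in combinations_with_replacement(range(len(z)), degree):
        # multinomial coefficient degree! / (k_1! * k_2! * ...) for this
        # choice of exponents
        coef = math.factorial(degree)
        for i in set(combo):
            coef //= math.factorial(combo.count(i))
        feats.append(math.sqrt(coef) * math.prod(z[i] for i in combo))
    return feats

if __name__ == "__main__":
    x, y, d = [1.0, 2.0], [3.0, 0.5], 3
    k = poly_kernel(x, y, degree=d)                      # (4 + 1) ** 3 = 125
    fx, fy = phi(x, degree=d), phi(y, degree=d)
    print(len(fx), sum(a * b for a, b in zip(fx, fy)), k)
```

For n input variables and degree d the map has C(n + d, d) coordinates (here C(4, 3) = 10 for n = 2, d = 3), which is why the SVM never builds it explicitly, while a GMP must select a small evolved subset of such monomials.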