We present an algorithm able to fit a set of data with unknown characteristics. The algorithm rests on two premises: first, that a sufficiently large set of examples will adequately embody the essence of a numerically expressible phenomenon; and second, that it is possible to synthesize that essence via a polynomial of the form

F(V_1, \ldots, V_n) = \sum_{I_1=0}^{D_1} \cdots \sum_{I_n=0}^{D_n} \mu_{I_1 \ldots I_n} \, C_{I_1 \ldots I_n} \, V_1^{I_1} \cdots V_n^{I_n}

where each \mu_{I_1 \ldots I_n} may only take the value 0 or 1, depending on whether the corresponding monomial is adequate. To determine the adequacy of each monomial we resort to a genetic algorithm which minimizes the fitting error of the candidate polynomials. We analyze a set of selected data sets and find the best approximating polynomial for each. We compare our results with those stemming from multilayer perceptron networks trained with the well-known backpropagation algorithm (BMLPs). We show that our Genetic Multivariate Polynomials (GMPs) compare favorably with the corresponding BMLPs while avoiding the "black box" characteristic of the latter, a frequently cited disadvantage. We also discuss the minimizing genetic algorithm (GA).
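The scheme described above can be sketched in code: a binary chromosome plays the role of the μ coefficients, selecting which monomials enter the model; the C coefficients are then fit by least squares, and a simple genetic algorithm searches for the mask with the smallest fitting error. This is a minimal illustration under assumed GA settings (population size, mutation rate, one-point crossover), not the authors' exact implementation.

```python
import itertools
import random

import numpy as np


def monomial_matrix(X, degrees):
    """Column j holds V_1^i1 * ... * V_n^in for the j-th exponent tuple."""
    exps = list(itertools.product(*(range(d + 1) for d in degrees)))
    cols = [np.prod(X ** np.array(e), axis=1) for e in exps]
    return np.column_stack(cols), exps


def fitting_error(mask, M, y):
    """RMSE of the least-squares fit using only monomials with mu = 1."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return np.inf
    coef, *_ = np.linalg.lstsq(M[:, idx], y, rcond=None)
    return float(np.sqrt(np.mean((M[:, idx] @ coef - y) ** 2)))


def evolve_gmp(X, y, degrees, pop=30, gens=60, pmut=0.05, seed=0):
    """Evolve a binary monomial mask minimizing the polynomial fitting error."""
    rng = random.Random(seed)
    M, exps = monomial_matrix(X, degrees)
    n = len(exps)
    popl = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        # Elitist selection: keep the better half, refill with offspring.
        popl.sort(key=lambda m: fitting_error(np.array(m), M, y))
        survivors = popl[: pop // 2]
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)           # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (rng.random() < pmut) for g in child]  # bit-flip mutation
            children.append(child)
        popl = survivors + children
    best = min(popl, key=lambda m: fitting_error(np.array(m), M, y))
    return np.array(best), exps, fitting_error(np.array(best), M, y)
```

For example, data generated by y = 1 + 2·V1·V2 should be recovered almost exactly once the mask includes the constant and the V1·V2 monomial, since the remaining coefficients are free to vanish in the least-squares fit.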