What Makes a Problem Hard for a Genetic Algorithm? Some Anomalous Results and Their Explanation. Machine Learning, special issue on genetic algorithms.
An introduction to genetic algorithms
Genetic Algorithms in Search, Optimization and Machine Learning
An Analysis of the Effects of Neighborhood Size and Shape on Local Selection Algorithms. PPSN IV: Proceedings of the 4th International Conference on Parallel Problem Solving from Nature.
No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation.
Schema processing under proportional selection in the presence of random effects. IEEE Transactions on Evolutionary Computation.
Machine learning with genetic multivariate polynomials. AIKED'06: Proceedings of the 5th WSEAS International Conference on Artificial Intelligence, Knowledge Engineering and Data Bases.
Clustering with an N-dimensional extension of Gielis superformula. AIKED'08: Proceedings of the 7th WSEAS International Conference on Artificial Intelligence, Knowledge Engineering and Data Bases.
A methodology to find clusters in the data based on Shannon's entropy and genetic algorithms. ACELAE'11: Proceedings of the 10th WSEAS International Conference on Communications, Electrical & Computer Engineering, and 9th WSEAS International Conference on Applied Electromagnetics, Wireless and Optical Communications.
CIARP'06: Proceedings of the 11th Iberoamerican Conference on Progress in Pattern Recognition, Image Analysis and Applications.
Genetic multivariate polynomials: an alternative tool to neural networks. CIARP'05: Proceedings of the 10th Iberoamerican Congress on Progress in Pattern Recognition, Image Analysis and Applications.
IBERAMIA-SBIA'06: Proceedings of the 2nd International Joint Conference (10th Ibero-American Conference on AI and 18th Brazilian Conference on Advances in Artificial Intelligence).
MICAI'12: Proceedings of the 11th Mexican International Conference on Advances in Computational Intelligence, Part II.
The inherent complexity of Genetic Algorithms (GAs) has led to various theoretical and experimental approaches whose ultimate goal is to better understand the dynamics of such algorithms. Through such understanding, it is hoped, we will be able to improve their efficiency. Experiments typically explore a GA's behavior by testing it against a set of functions with characteristics deemed adequate. In this paper we present a methodology that aims at a sound relative evaluation of alternative GAs by resorting to statistical arguments. With it we may characterize any iterative optimization algorithm by statistically estimating the basic parameters of the probability distribution of its optimum values, without resorting to a priori chosen functions. We analyze the behavior of six algorithms (five variations of a GA and a hill climber), which we characterize and compare. We close with some remarks on the relation between statistical studies such as ours and the well-known "No Free Lunch" theorem.
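The core idea of the abstract — characterizing an optimizer by the distribution of its best-found values over repeated runs — can be sketched as follows. This is only my reading of the methodology, not the authors' code: the benchmark function, the GA operators, and all parameter values below are illustrative assumptions.

```python
# Sketch: run each optimizer many times, collect the optimum found in
# each run, and estimate parameters (mean, std. dev.) of the resulting
# distribution of optima. All choices here are illustrative.
import random
import statistics


def sphere(x):
    # A simple minimization benchmark (illustrative choice, not the
    # paper's test functions).
    return sum(v * v for v in x)


def hill_climber(dim=5, iters=200, step=0.1, rng=random):
    # Accept a candidate only if it improves the current best.
    x = [rng.uniform(-1, 1) for _ in range(dim)]
    best = sphere(x)
    for _ in range(iters):
        cand = [v + rng.gauss(0, step) for v in x]
        f = sphere(cand)
        if f < best:
            x, best = cand, f
    return best


def simple_ga(dim=5, pop=20, gens=50, rng=random):
    # Minimal real-coded GA: truncation selection, one-point
    # crossover, Gaussian point mutation.
    popl = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        popl.sort(key=sphere)
        parents = popl[: pop // 2]           # keep the better half
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim)      # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(dim)           # mutate one gene
            child[i] += rng.gauss(0, 0.05)
            children.append(child)
        popl = parents + children
    return min(sphere(ind) for ind in popl)


def characterize(optimizer, runs=30, seed=0):
    # Estimate the mean and standard deviation of the distribution
    # of optima found by `optimizer` over independent runs.
    rng = random.Random(seed)
    optima = [optimizer(rng=rng) for _ in range(runs)]
    return statistics.mean(optima), statistics.stdev(optima)


if __name__ == "__main__":
    for name, opt in [("hill climber", hill_climber), ("GA", simple_ga)]:
        mu, sigma = characterize(opt)
        print(f"{name}: mean={mu:.4f} sd={sigma:.4f}")
```

The estimated (mean, standard deviation) pairs then serve as the statistical "signature" of each algorithm, allowing relative comparison without committing to a fixed a priori test suite.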