The Design of Innovation: Lessons from and for Competent Genetic Algorithms
A Survey of Optimization by Building and Using Probabilistic Models
Computational Optimization and Applications
Bayesian optimization algorithm: from single level to hierarchy
Gene Expression and Fast Construction of Distributed Evolutionary Representation
Evolutionary Computation
Building Blocks, Cohort Genetic Algorithms, and Hyperplane-Defined Functions
Evolutionary Computation
Scalability problems of simple genetic algorithms
Evolutionary Computation
Linkage identification by non-monotonicity detection for overlapping functions
Evolutionary Computation
Real options approach to evaluating genetic algorithms
Applied Soft Computing
Direct and explicit building blocks identification and composition algorithm
CEC'09 Proceedings of the Eleventh conference on Congress on Evolutionary Computation
This paper presents a line of research in genetic algorithms (GAs) called building-block identification. Building blocks (BBs) are common structures inferred from a set of solutions. In the simple GA, the crossover operator plays an important role in mixing BBs; however, because the cut point is chosen at random, crossover may disrupt the BBs. The BBs therefore need to be identified explicitly so that solutions can be mixed efficiently. Let S be a set of binary solutions, where each solution s = b1 ... bℓ with bi ∈ {0, 1}. We construct a symmetric matrix whose element in row i and column j, denoted mij, is the chi-square statistic of variables bi and bj. The larger mij is, the stronger the dependency between bit i and bit j. If mij is high, bit i and bit j should be passed on together to prevent BB disruption. Our approach is validated on additively decomposable functions (ADFs) and hierarchically decomposable functions (HDFs). In terms of scalability, our approach shows a polynomial relationship between the number of function evaluations required to reach the optimum and the problem size. A comparison between the chi-square matrix and the hierarchical Bayesian optimization algorithm (hBOA) shows that computing the matrix is 10 times faster and uses 10 times less memory than constructing the Bayesian network.
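The matrix construction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: for each pair of bit positions (i, j), it builds the 2x2 contingency table over the solution set and computes the standard chi-square statistic from observed and expected counts. The function name `chi_square_matrix` and the toy population are assumptions for the example.

```python
import random

def chi_square_matrix(solutions):
    """Return a symmetric ell x ell matrix m where m[i][j] is the
    chi-square statistic of bits i and j over the solution set."""
    n = len(solutions)
    ell = len(solutions[0])
    m = [[0.0] * ell for _ in range(ell)]
    for i in range(ell):
        for j in range(i + 1, ell):
            # Observed counts for the four joint outcomes (bi, bj).
            obs = {(a, b): 0 for a in (0, 1) for b in (0, 1)}
            for s in solutions:
                obs[(s[i], s[j])] += 1
            # Marginal counts for each bit.
            pi = [obs[(a, 0)] + obs[(a, 1)] for a in (0, 1)]
            pj = [obs[(0, b)] + obs[(1, b)] for b in (0, 1)]
            chi2 = 0.0
            for a in (0, 1):
                for b in (0, 1):
                    expected = pi[a] * pj[b] / n
                    if expected > 0:
                        chi2 += (obs[(a, b)] - expected) ** 2 / expected
            m[i][j] = m[j][i] = chi2
    return m

# Toy population: bit 1 always copies bit 0 (fully dependent),
# bit 2 is drawn independently.
random.seed(0)
pop = []
for _ in range(200):
    a = random.randint(0, 1)
    pop.append((a, a, random.randint(0, 1)))
m = chi_square_matrix(pop)
```

In this example m[0][1] is large (for a perfectly dependent pair the statistic equals the population size), while m[0][2] stays small, so bits 0 and 1 would be kept together during mixing.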