The performance of an evolutionary algorithm (EA) on a given problem depends on its parameter settings, so users must tune the parameters to optimize performance on different problems. When the user has no prior knowledge of the problem, parameter tuning is difficult and time consuming: one must try many combinations of parameter values to find a good setting. An alternative is to control the parameters during the EA run. This paper proposes a new adaptive parameter control system, called the Parameter Control system using the entire Search History (PCSH). It is a general add-on system that is not restricted to a specific class of EA; users only need to know the ranges of the parameters. It automatically adjusts the parameters of an EA according to the entire search history, in a parameter-less manner. To illustrate its performance, PCSH is applied to control the parameters of three common classes of EAs: (1) the canonical Genetic Algorithm (GA), (2) Particle Swarm Optimization (PSO), and (3) Differential Evolution (DE). For GA, we show that PCSH can automatically control the crossover operator, the crossover values (uniformly sampled from the range), and the mutation operator. For DE, we show that PCSH can automatically control the crossover operator, the crossover values, and the differential amplification factor (uniformly sampled from the ranges). For PSO, we show that PCSH can automatically control the two learning factors and the inertia weight (uniformly sampled from the ranges). Moreover, no special provision is needed at initialization. Thirty-four benchmark functions are used to evaluate the performance comprehensively. The test results show that, on most of the benchmark functions, the performance of the test EAs is improved, or remains similar, after adopting PCSH.
This shows that PCSH maintains or improves the performance of the test EAs while relieving the algorithm designer of the heavy burden of setting these parameters.
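To make the general idea concrete, the sketch below shows a minimal, hypothetical history-based parameter controller attached to a (1+1)-EA on the sphere function. This is NOT the paper's PCSH algorithm; it only illustrates the paradigm the abstract describes: the user supplies just a parameter range, candidate values are sampled uniformly from that range, and feedback from the entire search history biases future sampling toward values that produced fitness improvements. All names (`HistoryController`, `suggest`, `feedback`) are invented for illustration.

```python
import random


class HistoryController:
    """Illustrative history-based parameter controller (not the paper's PCSH).

    Records every (parameter value, fitness improvement) pair seen during the
    run, and biases future suggestions toward values that led to improvement,
    while still sampling uniformly from the range for exploration.
    """

    def __init__(self, low, high, explore=0.3, rng=None):
        self.low, self.high = low, high
        self.explore = explore          # probability of a uniform exploratory sample
        self.history = []               # entire search history: (value, improvement)
        self.rng = rng or random.Random()

    def suggest(self):
        """Propose the next parameter value to try."""
        successful = [v for v, imp in self.history if imp > 0]
        if not successful or self.rng.random() < self.explore:
            # exploration: uniform sample from the user-supplied range
            return self.rng.uniform(self.low, self.high)
        # exploitation: perturb a previously successful value, clipped to the range
        v = self.rng.choice(successful) + self.rng.gauss(0, 0.1 * (self.high - self.low))
        return min(self.high, max(self.low, v))

    def feedback(self, value, improvement):
        """Credit the tried value with the fitness change it produced."""
        self.history.append((value, improvement))


def sphere(x):
    """Classic benchmark: f(x) = sum of squares, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)


def one_plus_one_ea(dim=5, iters=2000, seed=1):
    """(1+1)-EA whose mutation step size is set by the controller each step."""
    rng = random.Random(seed)
    ctrl = HistoryController(1e-3, 1.0, rng=rng)   # range of the step size
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = sphere(x)
    for _ in range(iters):
        sigma = ctrl.suggest()
        y = [xi + rng.gauss(0, sigma) for xi in x]
        fy = sphere(y)
        ctrl.feedback(sigma, fx - fy)              # positive => improvement
        if fy <= fx:
            x, fx = y, fy
    return fx
```

The design choice mirrors the abstract's claims: the EA itself is unchanged (the controller is an add-on), no initial parameter value must be chosen (the first suggestions are uniform samples), and the whole run's history, not just the last generation, informs control.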