International Journal of Metaheuristics
The non-revisiting genetic algorithm (NrGA) is extended to handle continuous search spaces. The extended model, continuous NrGA (cNrGA), employs the same tree-structured archive as NrGA to memorize evaluated solutions, dividing the search space into non-overlapping partitions according to the distribution of those solutions. cNrGA is a bi-modulus evolutionary algorithm consisting of a genetic algorithm module (GAM) and an adaptive mutation module (AMM). When GAM generates an offspring, the offspring is passed to AMM and mutated according to the density of the solutions stored in the memory archive. A high solution density around a point suggests that the point is close to an optimum, so a near (small-step) search is performed there; conversely, a far (large-step) search is performed around a point with low solution density. The space-partitioning scheme yields a fast approximation of solution density, and the adaptive mutation scheme naturally avoids generating out-of-bound solutions. The performance of cNrGA is tested on 14 benchmark functions with dimensions ranging from 2 to 40, and is compared with a real-coded GA, differential evolution, the covariance matrix adaptation evolution strategy, and two improved particle swarm optimization variants. The simulation results show that cNrGA outperforms the other algorithms on multi-modal function optimization.
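The density-driven mutation described above can be sketched in a few lines. This is a minimal illustration, not the paper's actual tree-based method: `solution_density` here uses a crude radius count over a flat archive as a stand-in for cNrGA's partition-depth estimate, and the step-size rule (wide step at low density, narrow step at high density, sampled inside the feasible box so no out-of-bound repair is needed) is a hypothetical simplification of AMM.

```python
import random

def solution_density(archive, point, radius):
    """Fraction of archived solutions within `radius` of `point` (Chebyshev
    distance). A crude stand-in for cNrGA's tree-based density estimate."""
    if not archive:
        return 0.0
    near = sum(1 for p in archive
               if max(abs(a - b) for a, b in zip(p, point)) <= radius)
    return near / len(archive)

def adaptive_mutate(offspring, archive, bounds, radius=0.1):
    """Mutate each gene with a step size inversely tied to local density:
    high density -> near search (small step), low density -> far search
    (large step). Steps are sampled uniformly inside the clipped window
    around the gene, so the mutant never leaves the feasible box."""
    density = solution_density(archive, offspring, radius)
    scale = 1.0 - density           # low density -> wide search window
    mutant = []
    for x, (lo, hi) in zip(offspring, bounds):
        span = (hi - lo) * max(scale, 0.01)
        a = max(lo, x - span / 2)   # clip window to the bounds instead of
        b = min(hi, x + span / 2)   # repairing out-of-bound mutants later
        mutant.append(random.uniform(a, b))
    return mutant
```

In a full loop, every offspring produced by the GA module would be appended to `archive` after evaluation, so the density estimate sharpens as the search converges.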