This paper considers the (1+1) evolutionary algorithm (EA) and randomized local search (RLS) augmented with memory. Previously explored solutions are stored in memory until an improvement in fitness is obtained; the stored information is then discarded. This yields two new algorithms: (1+1) EA-m (with a raw-list and a hash-table option) and RLS-m+ (or RLS-m when the function is known a priori to be unimodal). Both algorithms can be regarded as very simple forms of tabu search. A rigorous theoretical analysis of the expected time to find the globally optimal solution is conducted for both unimodal and multimodal functions. A unified mathematical framework, built on the new concept of a spatially invariant neighborhood, is proposed; under it, both the (1+1) EA with standard uniform mutation and RLS arise as particular instances, and in the most general case all functions can be treated as unimodal. Within this framework, memory assistance always yields a positive improvement on unimodal functions, but by at most a factor of one half. On multimodal functions the improvement is significant: for functions with gaps, and for another hard function, the order of growth of the runtime is reduced, and for at least one example function the order changes from exponential to polynomial. Empirical results, under a reasonable assumption on fitness-evaluation time, confirm that (1+1) EA-m and RLS-m+ outperform their conventional counterparts. Both new algorithms are promising components for memetic algorithms; in particular, RLS-m+ makes the previously impractical RLS practical and, surprisingly, requires no extra memory in an actual implementation.
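The memory mechanism described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' exact algorithm: the function name, parameters, and the acceptance rule for equal-fitness offspring are assumptions. It shows the hash-table option of (1+1) EA-m: offspring that were already explored are skipped without a fitness evaluation, and the memory is discarded whenever fitness strictly improves.

```python
import random

def one_plus_one_ea_m(fitness, n, budget=20000, seed=1):
    """Sketch of a (1+1) EA with memory (hash-table option).

    `fitness` maps a length-n bit tuple to a number to be maximized.
    `budget` bounds the number of mutation attempts (assumed parameter).
    """
    rng = random.Random(seed)
    parent = tuple(rng.randint(0, 1) for _ in range(n))
    best = fitness(parent)
    memory = {parent}  # previously explored solutions
    for _ in range(budget):
        # standard uniform mutation: flip each bit independently with prob 1/n
        child = tuple(b ^ (rng.random() < 1.0 / n) for b in parent)
        if child in memory:
            continue  # revisit: skip the (possibly costly) fitness evaluation
        memory.add(child)
        f = fitness(child)
        if f >= best:
            if f > best:
                memory = {child}  # improvement found: discard stored information
            parent, best = child, f
    return parent, best
```

For example, on OneMax (the number of one-bits), `one_plus_one_ea_m(sum, 10)` reliably reaches the all-ones optimum within the default budget. Replacing uniform mutation with a single random bit-flip turns the same skeleton into the memory-assisted RLS variant.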