Simulated Annealing is a probabilistic search heuristic for solving optimization problems and is used with great success on real-life problems. In its standard form, Simulated Annealing has two parameters: the initial temperature and the cooldown factor. The literature offers only rules of thumb for choosing appropriate values for them. This paper investigates, from a theoretical point of view, how different values of these two parameters influence the optimization process, and presents criteria for adjusting them to a specific problem. Using these results, the performance of Simulated Annealing on the Longest Common Subsequence Problem is analysed for different values of the two parameters. For all of these parameter settings it is proved that even rather simple input instances of the Longest Common Subsequence Problem can be solved efficiently neither to optimality nor within an approximation factor arbitrarily close to 2.
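To make the role of the two parameters concrete, the following is a minimal sketch of standard Simulated Annealing with a geometric cooling schedule. The parameter names `t0` (initial temperature) and `alpha` (cooldown factor), the step budget, and the toy objective are illustrative assumptions, not taken from the paper:

```python
import math
import random

def simulated_annealing(cost, neighbor, start, t0=100.0, alpha=0.99, steps=10_000):
    """Minimize `cost` by Simulated Annealing with geometric cooling.

    t0 is the initial temperature; alpha is the cooldown factor
    (the temperature is multiplied by alpha after every step).
    """
    current, temp = start, t0
    best = current
    for _ in range(steps):
        cand = neighbor(current)
        delta = cost(cand) - cost(current)
        # Always accept improvements; accept worsenings with
        # probability exp(-delta / temp) (Metropolis criterion).
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current = cand
        if cost(current) < cost(best):
            best = current
        temp *= alpha  # geometric cooling: temperature decays toward 0
    return best

# Toy usage (illustrative): minimize (x - 3)^2 over the integers.
random.seed(0)
result = simulated_annealing(
    cost=lambda x: (x - 3) ** 2,
    neighbor=lambda x: x + random.choice([-1, 1]),
    start=50,
)
```

A large `t0` and an `alpha` close to 1 make the search accept many worsening moves early on (broad exploration), while a small `t0` or aggressive cooling quickly reduces it to local search; the paper's point is that no choice of these two values rescues the algorithm on the hard Longest Common Subsequence instances it constructs.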