Squeaky wheel optimization (SWO) is a relatively new metaheuristic that has proved effective on many real-world problems. At each iteration, SWO constructs a complete solution from scratch, starting from the empty assignment. Although the construction uses information from previous iterations, the complete rebuilding means that SWO is generally effective at diversification but can suffer from relatively weak intensification. Evolutionary SWO (ESWO) is a recent extension designed to improve intensification by keeping the good components of a solution and using SWO only to reconstruct the poorer components. A standard challenge for such algorithms is to understand how the various parameters affect the search process. To support the future study of such issues, we propose a formal framework for the analysis of ESWO. The framework is based on Markov chains, and its main novelty arises because ESWO moves through the space of partial assignments. This makes it significantly different from the analyses used for local search methods (such as simulated annealing), which move only through complete assignments. In general, the exact details of ESWO depend on various heuristics, so we focus on a variant of ESWO that we call ESWO-II, which uses probabilistic rather than heuristic selection and construction operators. For ESWO-II, we study a simple problem instance and explicitly compute the stationary probability distribution over the states of the search space. We find interesting properties of the distribution. In particular, the probabilities of states generally, but not always, increase with their fitness. This nonmonotonicity is quite different from the monotonicity expected in algorithms such as simulated annealing.
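The analysis described above rests on computing the stationary distribution of a Markov chain over the states of the search space. As a minimal sketch of that computation (the 4-state transition matrix below is invented for illustration and is not taken from the paper), the stationary distribution can be obtained as the left eigenvector of the transition matrix associated with eigenvalue 1:

```python
import numpy as np

# Hypothetical row-stochastic transition matrix over 4 search states.
# The entries are illustrative only; a real ESWO-II analysis would
# derive them from the selection and construction operators.
P = np.array([
    [0.5, 0.3, 0.1, 0.1],
    [0.2, 0.5, 0.2, 0.1],
    [0.1, 0.2, 0.5, 0.2],
    [0.1, 0.1, 0.3, 0.5],
])

# The stationary distribution pi satisfies pi @ P = pi with sum(pi) = 1,
# i.e. pi is a left eigenvector of P for eigenvalue 1. Equivalently, it
# is a (right) eigenvector of P.T, which is what np.linalg.eig returns.
evals, evecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(evals - 1.0))   # pick the eigenvalue closest to 1
pi = np.real(evecs[:, idx])
pi = pi / pi.sum()                      # normalise to a probability vector
```

With the stationary vector in hand, one can check directly whether state probabilities increase monotonically with fitness, which is the kind of property the paper examines for ESWO-II.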