Using Experimental Design to Find Effective Parameter Settings for Heuristics
Journal of Heuristics
A Racing Algorithm for Configuring Metaheuristics
GECCO '02: Proceedings of the Genetic and Evolutionary Computation Conference
Stochastic Local Search: Foundations & Applications
Determining the Number of Clusters/Segments in Hierarchical Clustering/Segmentation Algorithms
ICTAI '04: Proceedings of the 16th IEEE International Conference on Tools with Artificial Intelligence
Data Mining: Concepts and Techniques
Viz: A Visual Analysis Suite for Explaining Local Search Behavior
UIST '06: Proceedings of the 19th Annual ACM Symposium on User Interface Software and Technology
Fine-Tuning of Algorithms Using Fractional Experimental Designs and Local Search
Operations Research
Learning While Optimizing an Unknown Fitness Surface
Learning and Intelligent Optimization
AAAI'05: Proceedings of the 20th National Conference on Artificial Intelligence, Volume 3
Automatic Algorithm Configuration Based on Local Search
AAAI'07: Proceedings of the 22nd National Conference on Artificial Intelligence, Volume 2
SATzilla: Portfolio-Based Algorithm Selection for SAT
Journal of Artificial Intelligence Research
ParamILS: An Automatic Algorithm Configuration Framework
Journal of Artificial Intelligence Research
An Integrated White+Black Box Approach for Designing and Tuning Stochastic Local Search
CP'07: Proceedings of the 13th International Conference on Principles and Practice of Constraint Programming
Fitness Landscape Analysis and Memetic Algorithms for the Quadratic Assignment Problem
IEEE Transactions on Evolutionary Computation
Quantifying Homogeneity of Instance Sets for Algorithm Configuration
LION'12: Proceedings of the 6th International Conference on Learning and Intelligent Optimization
This paper is concerned with the automated tuning of parameters in local-search-based metaheuristics. Several generic approaches in the literature return a "one-size-fits-all" parameter configuration for all instances. This is unsatisfactory, since different instances may require very different parameter configurations for the algorithm to find good solutions. Approaches that perform instance-based automated tuning exist, but they are usually problem-specific. In this paper, we propose CluPaTra, a generic (problem-independent) approach to parameter tuning based on CLUstering instances with similar PAtterns according to their search TRAjectories. We propose representing a search trajectory as a directed sequence and applying a well-studied sequence alignment technique to cluster instances by the similarity of their respective search trajectories. We verify our work on the Traveling Salesman Problem (TSP) and the Quadratic Assignment Problem (QAP). Experimental results show that CluPaTra offers significant improvement over ParamILS (a one-size-fits-all approach). CluPaTra is statistically significantly better than clustering based on simple problem-specific features, and it is statistically comparable to tuning QAP instances based on a well-known distance-and-flow-metric classification.
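The core idea of the abstract — encode each instance's search trajectory as a symbol sequence, score pairwise similarity with a classic sequence alignment, then group instances by that similarity — can be sketched as follows. This is a minimal illustration under our own assumptions (the symbol alphabet, scoring weights, threshold, and the single-linkage grouping are ours, not the paper's exact method):

```python
def align_score(a, b, match=2, mismatch=-1, gap=-1):
    """Needleman-Wunsch-style global alignment score of two symbol sequences
    (each symbol standing for, e.g., a labeled point on a search trajectory)."""
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):          # aligning a prefix against nothing costs gaps
        dp[i][0] = dp[i - 1][0] + gap
    for j in range(1, m + 1):
        dp[0][j] = dp[0][j - 1] + gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + sub,   # substitute/match
                           dp[i - 1][j] + gap,       # gap in b
                           dp[i][j - 1] + gap)       # gap in a
    return dp[n][m]

def cluster(trajectories, threshold):
    """Single-linkage grouping via union-find: merge any two instances whose
    trajectory alignment score reaches the (assumed) threshold."""
    parent = list(range(len(trajectories)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i in range(len(trajectories)):
        for j in range(i + 1, len(trajectories)):
            if align_score(trajectories[i], trajectories[j]) >= threshold:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(len(trajectories)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Instances 0 and 1 have near-identical trajectories; instance 2 does not.
print(cluster(["ABBC", "ABBD", "XYZ"], threshold=4))  # -> [[0, 1], [2]]
```

In the paper's pipeline, each resulting cluster would then be tuned separately (e.g., by a configurator such as ParamILS), instead of one configuration being imposed on all instances.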