Machine Learning
Using Experimental Design to Find Effective Parameter Settings for Heuristics. Journal of Heuristics.
Efficient Global Optimization of Expensive Black-Box Functions. Journal of Global Optimization.
A Racing Algorithm for Configuring Metaheuristics. GECCO '02: Proceedings of the Genetic and Evolutionary Computation Conference.
Learning the Empirical Hardness of Optimization Problems: The Case of Combinatorial Auctions. CP '02: Proceedings of the 8th International Conference on Principles and Practice of Constraint Programming.
Scaling and Probabilistic Smoothing: Efficient Dynamic Local Search for SAT. CP '02: Proceedings of the 8th International Conference on Principles and Practice of Constraint Programming.
Global Optimization of Stochastic Black-Box Systems via Sequential Kriging Meta-Models. Journal of Global Optimization.
Experimental Research in Evolutionary Computation: The New Experimentalism (Natural Computing Series).
Finding Optimal Algorithmic Parameters Using Derivative-Free Optimization. SIAM Journal on Optimization.
Fine-Tuning of Algorithms Using Fractional Experimental Designs and Local Search. Operations Research.
Automatic algorithm configuration based on local search. AAAI '07: Proceedings of the 22nd National Conference on Artificial Intelligence, Volume 2.
Improvement strategies for the F-Race algorithm: sampling design and iterative refinement. HM '07: Proceedings of the 4th International Conference on Hybrid Metaheuristics.
UBCSAT: an implementation and experimentation environment for SLS algorithms for SAT and MAX-SAT. SAT '04: Proceedings of the 7th International Conference on Theory and Applications of Satisfiability Testing.
ParamILS: an automatic algorithm configuration framework. Journal of Artificial Intelligence Research.
On the generality of parameter tuning in evolutionary planning. Proceedings of the 12th Annual Conference on Genetic and Evolutionary Computation.
Modern continuous optimization algorithms for tuning real and integer algorithm parameters. ANTS '10: Proceedings of the 7th International Conference on Swarm Intelligence.
Time-bounded sequential parameter optimization. LION '10: Proceedings of the 4th International Conference on Learning and Intelligent Optimization.
Tradeoffs in the empirical evaluation of competing algorithm designs. Annals of Mathematics and Artificial Intelligence.
Review: Measuring instance difficulty for combinatorial optimization problems. Computers and Operations Research.
Automated configuration of mixed integer programming solvers. CPAIOR '10: Proceedings of the 7th International Conference on Integration of AI and OR Techniques in Constraint Programming for Combinatorial Optimization Problems.
Sequential model-based optimization for general algorithm configuration. LION '05: Proceedings of the 5th International Conference on Learning and Intelligent Optimization.
On the effect of response transformations in sequential parameter optimization. Evolutionary Computation.
On the anytime behavior of IPOP-CMA-ES. PPSN '12: Proceedings of the 12th International Conference on Parallel Problem Solving from Nature, Part I.
An analysis of post-selection in automatic configuration. Proceedings of the 15th Annual Conference on Genetic and Evolutionary Computation.
Is the meta-EA a viable optimization method? Proceedings of the 15th Annual Conference on Genetic and Evolutionary Computation.
Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation.
An evaluation of sequential model-based optimization for expensive blackbox functions. Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation.
Algorithm runtime prediction: Methods & evaluation. Artificial Intelligence.
A beginner's guide to tuning methods. Applied Soft Computing.
This work experimentally investigates model-based approaches for optimising the performance of parameterised, randomised algorithms. We restrict our attention to procedures based on Gaussian process models, the most widely studied family of models for this problem. We evaluated two approaches from the literature and found that sequential parameter optimisation (SPO) [4] offered the most robust performance. We then investigated key design decisions within the SPO paradigm, characterising the performance consequences of each. Based on these findings, we propose a new version of SPO, dubbed SPO+, which extends SPO with a novel intensification procedure and log-transformed response values. Finally, in a domain for which performance results for other (model-free) parameter optimisation approaches are available, we demonstrate that SPO+ achieves state-of-the-art performance.
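To make the abstract's core loop concrete, the following is a minimal sketch of SPO-style sequential model-based optimisation: fit a Gaussian process to log-transformed responses, pick the next configuration by expected improvement, evaluate it, and repeat. Everything here is illustrative, not the paper's implementation: the one-dimensional toy "runtime" function, the squared-exponential kernel with a fixed length-scale, the candidate grid, and the budget are all assumptions chosen to keep the example self-contained.

```python
import numpy as np
from math import erf, sqrt, pi

rng = np.random.default_rng(0)

def runtime(x):
    # Hypothetical noisy "algorithm runtime", minimised near x = 0.7.
    # Multiplicative lognormal noise is why a log transform of the
    # response is natural here.
    return float(np.exp(3.0 * (x - 0.7) ** 2) * rng.lognormal(0.0, 0.1))

def sq_exp_kernel(a, b, ell=0.2):
    # Squared-exponential kernel with fixed (assumed) length-scale ell.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_predict(X, y, Xs, noise=1e-4):
    # Standard GP regression equations with a small noise/jitter term.
    K = sq_exp_kernel(X, X) + noise * np.eye(len(X))
    Ks = sq_exp_kernel(Xs, X)
    mu = Ks @ np.linalg.solve(K, y - y.mean()) + y.mean()
    v = np.linalg.solve(K, Ks.T)
    var = np.clip(1.0 - np.sum(Ks * v.T, axis=1), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    # EI for minimisation: sigma * (z * Phi(z) + phi(z)), z = (best - mu) / sigma.
    z = (best - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))
    phi = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return sigma * (z * Phi + phi)

# Initial stratified design of 5 points on [0, 1], responses in log space.
X = (np.arange(5) + rng.random(5)) / 5.0
y = np.log([runtime(x) for x in X])

cand = np.linspace(0.0, 1.0, 201)
for _ in range(15):
    mu, sigma = gp_predict(X, y, cand)
    x_next = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.append(X, x_next)
    y = np.append(y, np.log(runtime(x_next)))

incumbent = X[np.argmin(y)]
```

The sketch omits the intensification step that distinguishes SPO+ (re-evaluating promising configurations before accepting them as incumbents); with a single noisy sample per point, the incumbent above is just the configuration with the best observed log-response.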