Particle Swarm Model Selection. Journal of Machine Learning Research.
ParamILS: An Automatic Algorithm Configuration Framework. Journal of Artificial Intelligence Research.
Parameter Tuning of Evolutionary Algorithms: Generalist vs. Specialist. EvoApplications'10: Proceedings of the 2010 International Conference on Applications of Evolutionary Computation, Volume Part I.
No Free Lunch Theorems for Optimization. IEEE Transactions on Evolutionary Computation.
Currently, there is no low-runtime solution to the problem of choosing preprocessing methods, feature selection algorithms, and classifiers for a supervised learning problem. In this paper we present a method for efficiently finding a combination of algorithms and parameters that effectively describes a dataset. We also present an optimization technique, based on ParamILS, that can be applied in other contexts where each evaluation of the objective function is highly time consuming but a cheap estimate of the function is available. We present our algorithm together with an initial validation on real and synthetic data, in which our proposal achieves a significant reduction in runtime compared to ParamILS on problems with these characteristics.
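The core cost-saving idea described above can be illustrated with a minimal sketch: use a cheap estimate of the objective to screen candidate configurations, and pay for the expensive full evaluation only when a candidate's estimate beats the incumbent. This is not the authors' algorithm; the names (`screened_search`, the toy objective, the `+0.5` offset standing in for a cheap proxy such as evaluation on a data subsample) are hypothetical choices made for this example.

```python
def screened_search(configs, evaluate, estimate):
    """Pick the best configuration, screening with a cheap estimate.

    evaluate: expensive objective (e.g. full cross-validation error).
    estimate: cheap proxy for the same objective (e.g. error on a
              small subsample). A candidate is fully evaluated only
              if its estimate undercuts the current best full score.
    Returns (best config, best full score, number of full evaluations).
    """
    best, best_score, full_evals = None, float("inf"), 0
    for c in configs:
        if estimate(c) < best_score:      # cheap screen first
            score = evaluate(c)           # expensive call, only if promising
            full_evals += 1
            if score < best_score:
                best, best_score = c, score
    return best, best_score, full_evals


# Toy example: minimize (x - 3)^2 over ten integer "configurations".
# The estimate is the true objective shifted by 0.5, standing in for
# a cheap but imperfect proxy of the expensive evaluation.
evaluate = lambda x: (x - 3) ** 2
estimate = lambda x: (x - 3) ** 2 + 0.5
best, best_score, full_evals = screened_search(range(10), evaluate, estimate)
```

In this toy run only four of the ten configurations are evaluated in full; the remaining six are rejected by the estimate alone, which is the kind of saving the abstract claims over plain ParamILS-style search, where every candidate would incur the expensive evaluation.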