Recent research has shown that population initialization can significantly affect the performance of metaheuristics. Opposition-based Differential Evolution (ODE), Quasi-Oppositional Differential Evolution (QODE), and Uniform-Quasi-Opposition Differential Evolution (UQODE) are three state-of-the-art methods that improve the performance of the Differential Evolution (DE) algorithm through population initialization and modified search strategies. Taking a different approach to the same goal, this paper presents a technique for discovering promising regions in the continuous search space of an optimization problem. Using machine-learning techniques, the proposed algorithm, named Smart Sampling (SS), finds regions with a high probability of containing a global optimum; a metaheuristic can then be initialized inside each region to locate that optimum. To evaluate the approach, SS was combined with DE (yielding the SSDE algorithm), and experiments were conducted on the same set of benchmark functions used by the authors of ODE, QODE, and UQODE. The results show that applying SS first significantly reduces the total number of function evaluations DE requires to reach the global optimum and improves the success rate. These results agree with findings in the literature on the importance of an adequate starting population. Moreover, SS finds initial populations of higher quality than the three opposition-based algorithms. Finally, and most importantly, SS's ability to find promising regions is independent of the metaheuristic with which it is combined, making SS suitable for improving a large variety of optimization techniques.
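The workflow described above (sample the search space, use the samples to identify a promising region, then seed the metaheuristic's initial population inside it) can be illustrated with a minimal sketch. This is not the authors' actual Smart Sampling algorithm; it is a hedged stand-in that uses a simple elite bounding box in place of SS's machine-learning step, with the `sphere` test function, sample counts, and elite fraction chosen purely for illustration:

```python
import random

def sphere(x):
    """Toy benchmark: global optimum 0 at the origin."""
    return sum(v * v for v in x)

def promising_region(f, bounds, n_samples=500, elite_frac=0.1, seed=0):
    """Crude stand-in for the Smart Sampling idea: draw uniform samples,
    keep the best fraction, and return the bounding box of those elite
    points as a single 'promising region'."""
    rng = random.Random(seed)
    dim = len(bounds)
    samples = [[rng.uniform(lo, hi) for lo, hi in bounds]
               for _ in range(n_samples)]
    samples.sort(key=f)  # best (lowest) objective values first
    elite = samples[: max(1, int(elite_frac * n_samples))]
    return [(min(p[i] for p in elite), max(p[i] for p in elite))
            for i in range(dim)]

def init_population(bounds, size, rng):
    """Uniform random population inside the given box, as a
    metaheuristic (e.g. DE) would use for its starting generation."""
    return [[rng.uniform(lo, hi) for lo, hi in bounds]
            for _ in range(size)]

rng = random.Random(42)
bounds = [(-100.0, 100.0)] * 5
region = promising_region(sphere, bounds)

seeded = init_population(region, 50, rng)   # population inside the region
uniform = init_population(bounds, 50, rng)  # plain random initialization

avg = lambda pop: sum(map(sphere, pop)) / len(pop)
print(avg(seeded) < avg(uniform))  # region-seeded start is closer to the optimum
```

A real implementation would replace the elite bounding box with SS's learning step and could return several disjoint regions, initializing one DE population per region, but the division of labor is the same: the region finder only narrows the search space, so any population-based metaheuristic can consume its output.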