It has been recognized in both theory and practice that uniformly distributed deterministic sequences provide more accurate results than purely random sequences. A quasi-Monte Carlo (QMC) variant of the Multi-Level Single Linkage (MLSL) algorithm for global optimization is compared with the original stochastic MLSL algorithm on a number of test problems of varying complexity, with an emphasis on high-dimensional problems. Two different low-discrepancy sequences (LDS) are used and their efficiency is analysed. It is shown that the use of LDS can significantly increase the efficiency of MLSL. The dependence on the number of variables of the sample size required to locate global minima is examined. It is found that higher confidence in the obtained solution, and possibly a reduction in computation time, can be achieved by increasing the total sample size N; N should also be increased as the dimensionality of the problem grows. For high-dimensional problems clustering methods become inefficient, and for such problems a multistart method can be more computationally expedient.
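To illustrate the core idea being compared, the sketch below replaces the pseudorandom sample of an MLSL-style multistart with a scrambled Sobol' (low-discrepancy) sample and starts local searches from the best sampled points. This is only a minimal illustration, not the authors' implementation: the Rosenbrock objective, the search box, the sample sizes, and the crude "keep the best fraction" reduction (standing in for MLSL's clustering step) are all assumptions made for the example, and scipy's qmc module supplies the Sobol' points.

# Minimal sketch: QMC (Sobol') versus purely random sampling in a
# reduced-sample multistart, as a stand-in for the MLSL comparison.
import numpy as np
from scipy.stats import qmc
from scipy.optimize import minimize

def rosenbrock(x):
    # Illustrative test objective; not one of the paper's benchmark set.
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

dim, n_sample, n_best = 8, 1024, 16   # assumed values for the example
lower, upper = -5.0, 5.0              # assumed search box

# Quasi-Monte Carlo sample: scrambled Sobol' points scaled to the box.
sobol = qmc.Sobol(d=dim, scramble=True, seed=0)
qmc_points = lower + (upper - lower) * sobol.random(n_sample)

# Purely random sample of the same size for comparison.
rng = np.random.default_rng(0)
mc_points = rng.uniform(lower, upper, size=(n_sample, dim))

def reduced_sample_multistart(points):
    """Keep the best fraction of the sample and run a local search from each
    retained point (a crude stand-in for MLSL's reduced-sample/clustering step)."""
    values = np.apply_along_axis(rosenbrock, 1, points)
    best = points[np.argsort(values)[:n_best]]
    results = [minimize(rosenbrock, x0, method="L-BFGS-B",
                        bounds=[(lower, upper)] * dim) for x0 in best]
    return min(r.fun for r in results)

print("QMC (Sobol') sample ->", reduced_sample_multistart(qmc_points))
print("Random sample       ->", reduced_sample_multistart(mc_points))

On a given budget N, the more even coverage of the Sobol' points tends to place at least one starting point in the region of attraction of the global minimum more reliably than a pseudorandom sample, which is the effect the abstract reports for the QMC variant of MLSL.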