Optimization Methods & Software - GLOBAL OPTIMIZATION
We build on the prior success of Improving Hit-and-Run (IHR) as a Monte Carlo random search algorithm for global optimization by generalizing its sampling distribution. Specifically, in place of the uniform step-size distribution in IHR, we employ a family of parameterized step-size distributions to sample candidate points; the IHR step-size distribution is a special case within this family. The parameterization is motivated by recent results on efficient decentralized search in so-called Small World problems. To improve the performance of the algorithm, we adaptively tune the parameter based on the success rate of obtaining improving points. We present analytical and numerical results on simple spherical programmes to illustrate how the parameterization relates to algorithm performance, and we extend these results to global optimization problems with Lipschitz continuous objective functions. Our preliminary numerical results demonstrate the potential benefit of considering parameterized versions of IHR.
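The abstract's idea can be sketched in code. The following is a minimal, hypothetical illustration, not the authors' exact algorithm: directions are drawn uniformly on the unit sphere (as in hit-and-run), the step size is drawn from an assumed heavy-tailed, Pareto-like family with a tunable exponent `alpha` standing in for the paper's parameterized step-size distributions, only improving points are accepted (the IHR rule), and `alpha` is adapted toward a target success rate of improving moves.

```python
import math
import random


def parameterized_hit_and_run(f, x0, n_iters=2000, alpha=1.0,
                              target_rate=0.3, adapt=0.05, seed=0):
    """Minimize f over R^n with an IHR-style random search.

    Hypothetical sketch: `alpha`, `target_rate`, and `adapt` are assumed
    parameters for illustration, not taken from the paper. Larger
    `alpha` gives a lighter-tailed step-size distribution (shorter
    steps); smaller `alpha` gives heavier tails (longer exploratory
    steps).
    """
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    successes = 0
    for t in range(1, n_iters + 1):
        # Uniform direction on the unit sphere via normalized Gaussians.
        d = [rng.gauss(0.0, 1.0) for _ in x]
        norm = math.sqrt(sum(c * c for c in d)) or 1.0
        d = [c / norm for c in d]
        # Heavy-tailed step size: r = u^(-1/alpha) - 1 (Pareto-like).
        u = rng.random() or 1e-12
        r = u ** (-1.0 / alpha) - 1.0
        cand = [xi + r * di for xi, di in zip(x, d)]
        fc = f(cand)
        if fc < fx:  # accept only improving points (IHR rule)
            x, fx = cand, fc
            successes += 1
        # Adapt: too few successes -> raise alpha -> shorter steps.
        rate = successes / t
        alpha = min(max(alpha + adapt * (target_rate - rate), 0.1), 10.0)
    return x, fx
```

For example, minimizing the spherical objective `sum(x_i^2)` from the starting point `[5.0, 5.0]` mirrors the "simple spherical programmes" mentioned in the abstract; since only improving points are accepted, the returned value never exceeds the starting objective.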