In this paper, we propose a new general-purpose optimization algorithm, the annealing evolutionary stochastic approximation Monte Carlo (AESAMC) algorithm, and study its convergence. AESAMC possesses a self-adjusting mechanism: its target distribution is adapted at each iteration according to the samples generated so far, which places it in the class of adaptive Monte Carlo methods. This mechanism also makes AESAMC less likely to become trapped in local energy minima than nonadaptive MCMC algorithms. Under mild conditions, we show that AESAMC converges weakly toward a neighboring set of the global minima in the space of energy. AESAMC is tested on multiple optimization problems, and the numerical results indicate that it can potentially outperform simulated annealing, the genetic algorithm, annealing stochastic approximation Monte Carlo, and other metaheuristics in function optimization.
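To make the annealing and self-adjusting ingredients concrete, the following is a minimal single-chain sketch of the stochastic approximation Monte Carlo machinery with an annealed (shrinking) sample space, in the spirit of the abstract. It is not the authors' reference implementation: the evolutionary (population/crossover) component is omitted, and the test objective (Rastrigin), the energy partition, and all tuning constants are illustrative assumptions.

```python
# Hedged sketch of an annealing SAMC-style optimizer; all names and
# constants are assumptions for illustration, not the paper's settings.
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # Rastrigin function: a standard multimodal test problem (assumed here).
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def aesamc_sketch(dim=2, n_iter=20000, n_bins=50, e_lo=0.0, e_hi=100.0,
                  delta=5.0, step=0.5, t0=1000.0):
    edges = np.linspace(e_lo, e_hi, n_bins + 1)  # partition of the energy space
    theta = np.zeros(n_bins)                     # log-weights, one per subregion
    pi = np.full(n_bins, 1.0 / n_bins)           # desired sampling frequencies
    x = rng.uniform(-5.0, 5.0, dim)
    u = energy(x)
    best_x, best_u = x.copy(), u

    def bin_of(e):
        return int(np.clip(np.searchsorted(edges, e) - 1, 0, n_bins - 1))

    for t in range(1, n_iter + 1):
        gamma = t0 / max(t0, t)                  # decaying stochastic-approximation gain
        y = x + rng.normal(0.0, step, dim)       # random-walk proposal
        v = energy(y)
        # Annealing: shrink the sample space to energies near the best found so far.
        if v <= best_u + delta:
            j, k = bin_of(u), bin_of(v)
            # SAMC acceptance: the adaptive weights theta bias the chain
            # toward rarely visited (typically low-energy) subregions.
            if np.log(rng.uniform()) <= (u - v) + (theta[j] - theta[k]):
                x, u = y, v
        # Self-adjusting mechanism: update the log-weights from the current sample.
        visit = np.zeros(n_bins)
        visit[bin_of(u)] = 1.0
        theta += gamma * (visit - pi)
        if u < best_u:
            best_x, best_u = x.copy(), u
    return best_x, best_u

print(aesamc_sketch())
```

The weight update is what makes the sampler adaptive: subregions visited more often than their target frequency have their weights raised, which penalizes revisiting them and pushes the chain out of local energy minima. The full algorithm in the paper additionally runs a population of chains with crossover-type moves, which this single-chain sketch does not attempt to reproduce.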