This paper extends two optimization routines to deal with objective functions for DSGE models. The optimization routines are (1) a version of Simulated Annealing developed by Corana, Marchesi, and Ridella (ACM Trans Math Softw 13(3):262–280, 1987), and (2) the evolutionary algorithm CMA-ES developed by Hansen, Müller, and Koumoutsakos (Evol Comput 11(1), 2003). Following these extensions, we examine the ability of the two routines to maximize the likelihood function for a sequence of test economies. Our results show that the CMA-ES routine clearly outperforms Simulated Annealing both in its ability to find the global optimum and in efficiency. With ten unknown structural parameters in the likelihood function, the CMA-ES routine finds the global optimum in 95% of our test economies, compared to 89% for Simulated Annealing. When the number of unknown structural parameters in the likelihood function increases to 20 and 35, the CMA-ES routine finds the global optimum in 85% and 71% of our test economies, respectively. The corresponding numbers for Simulated Annealing are 70% and 0%.
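The simulated annealing side of the comparison can be illustrated with a minimal sketch. This is a generic Metropolis loop with geometric cooling, not the Corana et al. (1987) routine itself (their per-coordinate step-vector adaptation is omitted), and the `ripple` objective below is a hypothetical multimodal stand-in for a likelihood surface:

```python
import math
import random

def simulated_annealing(f, x0, lower, upper, t0=10.0, cooling=0.95,
                        n_temps=100, n_moves=50, seed=0):
    """Minimize f over the box [lower, upper] with basic simulated annealing:
    random coordinate steps, Metropolis acceptance, geometric cooling.
    A sketch only -- Corana et al. additionally tune a step vector so that
    roughly half of all trial moves are accepted at each temperature."""
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    best_x, best_f = list(x), fx
    t = t0
    step = [(u - l) * 0.25 for l, u in zip(lower, upper)]
    for _ in range(n_temps):
        for _ in range(n_moves):
            # Perturb one randomly chosen coordinate, clipped to the box.
            i = rng.randrange(len(x))
            cand = list(x)
            cand[i] = min(upper[i], max(lower[i],
                          cand[i] + rng.uniform(-step[i], step[i])))
            fc = f(cand)
            # Metropolis rule: always accept improvements; accept
            # deteriorations with probability exp(-(fc - fx) / t).
            if fc <= fx or rng.random() < math.exp(-(fc - fx) / t):
                x, fx = cand, fc
                if fx < best_f:
                    best_x, best_f = list(x), fx
        t *= cooling  # geometric cooling schedule

    return best_x, best_f

# Toy multimodal objective: a quadratic bowl plus a cosine ripple, with
# many local minima and the global minimum value 0 at the origin.
def ripple(x):
    return sum(xi * xi - math.cos(4.0 * math.pi * xi) + 1.0 for xi in x)

xbest, fbest = simulated_annealing(ripple, [2.0, -2.0],
                                   lower=[-4.0, -4.0], upper=[4.0, 4.0])
```

The annealing temperature is what lets the search escape the ripple's local minima: early on, uphill moves are accepted often; as `t` shrinks, the walk settles into a basin. CMA-ES instead adapts a full covariance matrix for its sampling distribution, which is what the paper finds scales better as the parameter count grows.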