A new variant of pure random search (PRS) for function optimization is introduced. The basic finite-descent accelerated random search (ARS) algorithm is simple: the search is confined to shrinking neighborhoods of the previous record-generating value, and the search neighborhood is reinitialized to the entire space whenever a new record is found. Local maxima are avoided by an automatic restart feature that reinitializes the search neighborhood after a prescribed number of shrink steps have been performed.

One goal of this article is to provide rigorous mathematical comparisons of ARS to PRS. It is shown that the sequence produced by the ARS process converges, with probability one, to the maximum of a continuous objective function faster than that of the PRS process by adjustably large multiples of the time step (Theorem). For an infinite-descent (no automatic restart) version of ARS, it is shown that if the objective function satisfies a local nonflatness condition, then the right tails of the distributions of inter-record times are exponentially smaller than those of PRS (Theorem).

Performance comparisons between ARS, PRS, and three quasi-Newton-type optimization routines are reported for finding extrema of (i) each function in a small collection of standard test functions of two variables, and (ii) d-dimensional polynomials with random roots. Also reported is a three-way performance comparison between ARS, PRS, and a simulated annealing algorithm on traveling salesman problems.
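The finite-descent ARS loop described above (sample near the current record, reset the neighborhood to the full space on a new record, shrink it otherwise, and restart after the neighborhood becomes too small) can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact algorithm: the contraction factor, the restart threshold, uniform box sampling, and clamping to the bounds are all assumed choices.

```python
import random

def accelerated_random_search(f, bounds, n_iter=10_000,
                              shrink=2.0, restart_radius=1e-6):
    """Illustrative sketch of finite-descent ARS (maximization).

    bounds: list of (lo, hi) pairs, one per dimension.
    shrink and restart_radius are hypothetical tuning values,
    not the paper's prescribed constants.
    """
    widths = [hi - lo for lo, hi in bounds]
    # Start from a uniform draw over the whole space (a PRS-style step).
    best_x = [random.uniform(lo, hi) for lo, hi in bounds]
    best_f = f(best_x)
    radius = 1.0  # search radius as a fraction of each coordinate's range

    for _ in range(n_iter):
        # Sample uniformly in a box of the current radius around the record,
        # clamped to the feasible region.
        cand = [min(hi, max(lo, x + random.uniform(-radius * w, radius * w)))
                for x, (lo, hi), w in zip(best_x, bounds, widths)]
        fc = f(cand)
        if fc > best_f:
            # New record: accept it and reopen the search to the full space.
            best_x, best_f, radius = cand, fc, 1.0
        else:
            # No improvement: shrink the neighborhood around the record.
            radius /= shrink
            if radius < restart_radius:
                # Automatic restart: reset the neighborhood to escape
                # local maxima after too many consecutive shrink steps.
                radius = 1.0
    return best_x, best_f
```

For example, maximizing f(x) = -(x - 1)^2 over [-5, 5] should return a point near x = 1; the reset-on-record rule is what distinguishes ARS from a plain shrinking local search.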