We present a reformulation of stochastic global optimization as a filtering problem. The motivation for this reformulation is that, for many optimization problems, the objective function cannot be evaluated exactly. Similarly, we may not be able to evaluate exactly the functions involved in iterative optimization algorithms; for example, we may have access only to noisy measurements of the functions or to statistical estimates obtained through Monte Carlo sampling. Iterative optimization algorithms then behave like stochastic maps. Naive global optimization amounts to evolving a collection of realizations of this stochastic map and picking the realization with the best properties. This motivates the use of filtering techniques, which focus computational effort on the more promising realizations. In particular, we present a filtering reformulation of global optimization in terms of particle filters, a special case of sequential importance sampling methods whose growing popularity rests on the simplicity of their implementation and on their flexibility. We exploit this flexibility to construct a stochastic global optimization algorithm that can converge to the optimal solution appreciably faster than naive global optimization. Several examples of parametric exponential density estimation demonstrate the efficiency of the approach.
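The scheme described above can be illustrated with a minimal sketch of particle-filter-style global optimization. This is not the authors' algorithm; it is a generic weight-resample-perturb loop under assumed choices: a noisy one-dimensional quadratic objective (minimum at 2.0) standing in for a function we can only estimate, Boltzmann-type importance weights `exp(-beta * value)`, multinomial resampling to concentrate particles on promising realizations, and Gaussian jitter with a shrinking step size playing the role of the stochastic map. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_objective(x):
    # Noisy evaluation of a quadratic with minimum at x = 2.0; the additive
    # noise models the measurement / Monte Carlo error described in the text.
    return (x - 2.0) ** 2 + 0.1 * rng.normal(size=np.shape(x))

def particle_filter_optimize(f, n_particles=200, n_iters=50, sigma=0.5, beta=5.0):
    # Initialize a population of candidate solutions over a broad interval.
    particles = rng.uniform(-10.0, 10.0, size=n_particles)
    for _ in range(n_iters):
        values = f(particles)
        # Importance weights: lower (better) objective estimates get more weight.
        w = np.exp(-beta * (values - values.min()))
        w /= w.sum()
        # Resample: duplicate promising realizations, discard poor ones.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        # Perturb the survivors (the stochastic-map step), shrinking the
        # step size over time so the population can concentrate.
        particles = particles + sigma * rng.normal(size=n_particles)
        sigma *= 0.9
    # Pick the particle whose objective looks best after averaging out noise.
    est = np.array([np.mean([f(p) for _ in range(10)]) for p in particles])
    return particles[np.argmin(est)]

best = particle_filter_optimize(noisy_objective)
```

Compared with the naive approach of evolving all realizations independently and keeping the best, the resampling step reallocates the evaluation budget toward promising regions at every iteration, which is the source of the speed-up claimed in the abstract.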