Rates of Convergence for a Class of Global Stochastic Optimization Algorithms
SIAM Journal on Optimization
Motivated by recent developments in digital diffusion networks, this work establishes rates of convergence for a class of global stochastic optimization algorithms. Using weak convergence methods, we show that a sequence of suitably scaled estimation errors converges weakly to a diffusion process, that is, a solution of a stochastic differential equation. The scaling, together with the stationary covariance of the limit diffusion process, yields the desired rates of convergence. Application examples are also provided for image estimation problems.
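The rate-of-convergence argument can be illustrated on a toy problem. The sketch below (an assumption for illustration, not the paper's algorithm) runs a stochastic approximation recursion on a one-dimensional quadratic with noisy gradients and decreasing gains a_k = a/(k+1). For this setting the classical theory predicts that the scaled error sqrt(k)(x_k - x*) converges weakly to an Ornstein-Uhlenbeck diffusion whose stationary variance is a^2 sigma^2 / (2aH - 1) when 2aH > 1; the simulation compares the empirical variance of the scaled error against that value. All parameter names (H, a, sigma, K, N) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy objective f(x) = (H/2) * (x - x_star)^2, observed through noisy gradients.
H, x_star, sigma, a = 1.0, 0.0, 1.0, 1.0   # illustrative parameters (assumed)
K, N = 4000, 20000                          # iterations, independent trajectories

x = np.ones(N)                              # start every trajectory at x_0 = 1
for k in range(K):
    step = a / (k + 1)                      # decreasing gain a_k = a / (k + 1)
    grad = H * (x - x_star)                 # exact gradient of the quadratic
    # Noisy-gradient update: gradient plus i.i.d. Gaussian observation noise.
    x = x - step * (grad + sigma * rng.standard_normal(N))

# Scaled estimation error sqrt(K) * (x_K - x_star).  Its variance should match
# the stationary variance of the limiting Ornstein-Uhlenbeck diffusion:
#     v = a^2 * sigma^2 / (2 a H - 1),   valid when 2 a H > 1.
scaled = np.sqrt(K) * (x - x_star)
v_theory = a**2 * sigma**2 / (2 * a * H - 1)
print(f"empirical variance: {scaled.var():.3f}, theory: {v_theory:.3f}")
```

With the parameters above the predicted stationary variance is 1, and the Monte Carlo estimate over 20,000 trajectories lands close to it, illustrating how the scaling plus the limit diffusion's stationary covariance determines the convergence rate.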