CSSE '08 Proceedings of the 2008 International Conference on Computer Science and Software Engineering - Volume 01
A gradient method for solving unconstrained minimization problems in a noisy environment is proposed and analyzed. The method combines a line-search technique with the Stochastic Approximation (SA) method: a line search along the negative gradient direction is applied while the iterates are far from the solution, and upon reaching some neighborhood of the solution the method switches to the SA stepsize rule. The key issue is determining the switching point, and it is resolved both theoretically and practically. The main result is the almost sure convergence of the proposed method, obtained with a finite number of line-search steps followed by infinitely many consecutive SA steps. Numerical results on a set of standard test problems confirm the theoretical expectations and demonstrate the efficiency of the method.
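The two-phase scheme described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the switching rule here is a simplified stand-in (a fixed iteration count rather than the paper's theoretical criterion), noise is added only to the gradient for simplicity, and the Armijo backtracking and the classical a_k = a0/k SA schedule are standard textbook choices assumed for illustration.

```python
import numpy as np

def hybrid_minimize(f, grad, x0, switch_iter=30, n_sa=500,
                    a0=1.0, beta=0.5, c=1e-4, seed=0):
    """Sketch of a hybrid scheme: Armijo backtracking along the
    negative (noisy) gradient far from the solution, then plain
    SA steps a_k = a0 / k once "near" it.

    The switch after a fixed number of iterations is a simplified
    stand-in for a principled switching criterion.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)

    # Phase 1: backtracking (Armijo) line search on a noisy gradient.
    for _ in range(switch_iter):
        g = grad(x) + 0.01 * rng.standard_normal(x.shape)
        t, fx = 1.0, f(x)
        # Backtrack until the Armijo sufficient-decrease test holds.
        while f(x - t * g) > fx - c * t * (g @ g) and t > 1e-12:
            t *= beta
        x = x - t * g

    # Phase 2: SA steps with the classical 1/k stepsize schedule,
    # which satisfies sum a_k = inf and sum a_k^2 < inf.
    for k in range(1, n_sa + 1):
        g = grad(x) + 0.01 * rng.standard_normal(x.shape)
        x = x - (a0 / k) * g
    return x
```

For example, on the quadratic f(x) = ||x||^2 with gradient 2x, the line-search phase drives the iterate close to the minimizer quickly, and the diminishing SA stepsizes then damp out the gradient noise instead of letting a fixed stepsize oscillate around the solution.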