A Nonmonotone Line Search Technique for Newton's Method
SIAM Journal on Numerical Analysis
CUTE: Constrained and Unconstrained Testing Environment
ACM Transactions on Mathematical Software (TOMS)
A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
SIAM Journal on Optimization
A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
SIAM Journal on Optimization
A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
SIAM Journal on Optimization
A New Active Set Algorithm for Box Constrained Optimization
SIAM Journal on Optimization
Robust Stochastic Approximation Approach to Stochastic Programming
SIAM Journal on Optimization
A new nonmonotone algorithm is proposed and analyzed for unconstrained nonlinear optimization. The nonmonotone techniques applied in this algorithm are based on the estimate sequence proposed by Nesterov (Introductory Lectures on Convex Optimization: A Basic Course, 2004) for convex optimization. Under suitable assumptions, global convergence of the algorithm is established for minimizing general nonlinear objective functions with Lipschitz continuous derivatives. For convex objective functions, the algorithm maintains the optimal convergence rate of convex optimization. In the numerical experiments, the algorithm is instantiated with safeguarded nonlinear conjugate gradient search directions. Numerical results show that the nonmonotone algorithm performs significantly better than the corresponding monotone algorithm on the unconstrained optimization problems in the CUTEr library (Bongartz et al. in ACM Trans. Math. Softw. 21:123–160, 1995).
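To make the idea of a nonmonotone acceptance rule concrete, the sketch below implements a classical max-based nonmonotone Armijo line search in the spirit of Grippo, Lampariello, and Lucidi (the first reference listed above), paired with plain steepest-descent directions. This is an illustrative assumption, not the estimate-sequence-based rule or the safeguarded conjugate gradient directions of the abstract's algorithm; the function name, memory length, and step-size parameters are all hypothetical choices.

```python
# Illustrative nonmonotone Armijo line search (Grippo-style max rule),
# NOT the estimate-sequence-based algorithm described in the abstract.
# A trial step alpha is accepted when
#     f(x + alpha*d) <= max(recent f values) + c1 * alpha * g.d,
# so the objective may increase temporarily, unlike a monotone Armijo rule.

def nonmonotone_armijo(f, grad, x0, memory=10, c1=1e-4, rho=0.5,
                       max_iter=500, tol=1e-8):
    """Steepest descent with a nonmonotone Armijo acceptance test.

    memory -- how many recent function values form the reference max
    c1     -- sufficient-decrease parameter (assumed value)
    rho    -- backtracking shrink factor (assumed value)
    """
    x = list(x0)
    history = [f(x)]                  # recent f values for the reference max
    for _ in range(max_iter):
        g = grad(x)
        gnorm2 = sum(gi * gi for gi in g)
        if gnorm2 ** 0.5 < tol:       # stationary point reached
            break
        d = [-gi for gi in g]         # steepest-descent direction
        fref = max(history[-memory:]) # nonmonotone reference value
        slope = -gnorm2               # directional derivative g . d < 0
        alpha = 1.0
        while f([xi + alpha * di for xi, di in zip(x, d)]) \
                > fref + c1 * alpha * slope:
            alpha *= rho              # backtrack until the test passes
        x = [xi + alpha * di for xi, di in zip(x, d)]
        history.append(f(x))
    return x
```

Because the acceptance test compares against the maximum over the last `memory` values rather than the current value alone, occasional uphill steps are tolerated, which is what allows nonmonotone methods to take larger steps than their monotone counterparts.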