A truncated Newton method with nonmonotone line search for unconstrained optimization. Journal of Optimization Theory and Applications.
Line search algorithms with guaranteed sufficient decrease. ACM Transactions on Mathematical Software.
Multi-step quasi-Newton methods for optimization. ICCAM'92: Proceedings of the Fifth International Conference on Computational and Applied Mathematics.
Testing unconstrained optimization software. ACM Transactions on Mathematical Software.
Convergence properties of nonlinear conjugate gradient methods. SIAM Journal on Optimization.
Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Computational Optimization and Applications.
Two new conjugate gradient methods based on modified secant equations. Journal of Computational and Applied Mathematics.
Conjugate gradient methods are appealing for large-scale nonlinear optimization problems because they avoid storing matrices. Recently, seeking fast convergence of these methods, Dai and Liao (Appl. Math. Optim. 43:87-101, 2001) proposed a conjugate gradient method based on the secant condition of quasi-Newton methods, and later Yabe and Takano (Comput. Optim. Appl. 28:203-225, 2004) proposed another conjugate gradient method based on the modified secant condition. In this paper, we make use of a multi-step secant condition given by Ford and Moghrabi (Optim. Methods Softw. 2:357-370, 1993; J. Comput. Appl. Math. 50:305-323, 1994) and propose two new conjugate gradient methods based on this condition. The methods are shown to be globally convergent under certain assumptions. Numerical results are reported.
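To illustrate the framework the abstract builds on, the following Python sketch implements a generic Dai-Liao-type conjugate gradient iteration: search directions d_{k+1} = -g_{k+1} + beta_k d_k, with beta_k derived from the secant condition B_{k+1} s_k = y_k as in Dai and Liao (2001). This is not the paper's two methods, which instead use the Ford-Moghrabi multi-step secant condition; the backtracking Armijo line search, restart safeguard, and parameter values (t, c1, rho) are illustrative assumptions rather than the Wolfe-type line search used in the convergence theory.

# A minimal sketch of a Dai-Liao-type nonlinear conjugate gradient method.
# Illustrative only: the line search and all parameter values are assumptions,
# and this is not the multi-step variant proposed in the paper.
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, c1=1e-4, rho=0.5, tol=1e-6, max_iter=5000):
    """Minimize f with directions d_{k+1} = -g_{k+1} + beta_k d_k, where
    beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k),
    s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k (the Dai-Liao choice,
    motivated by the secant condition B_{k+1} s_k = y_k)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g @ d >= 0.0:                  # safeguard: restart if not a descent direction
            d = -g
        # Backtracking line search enforcing the Armijo sufficient-decrease condition.
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * slope and alpha > 1e-12:
            alpha *= rho
        s = alpha * d                     # s_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                     # y_k
        denom = d @ y
        # Dai-Liao beta; fall back to steepest descent if the denominator is tiny.
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from the classic starting point.
f = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
g = lambda x: np.array([-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
                        200.0 * (x[1] - x[0]**2)])
print(dai_liao_cg(f, g, [-1.2, 1.0]))     # should approach the minimizer (1, 1)

The abstract's multi-step methods would replace y_k above with a combination of gradient differences from several previous iterates, interpolated as in Ford and Moghrabi; only that choice of secant data distinguishes them from the generic iteration sketched here.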