We study the global convergence of a two-parameter family of conjugate gradient methods in which the line search procedure is replaced by a fixed stepsize formula. This feature is significant when the line search is expensive in a particular application. In addition to the convergence results, we present computational results for various conjugate gradient methods without line search, including those discussed by Sun and Zhang (Ann. Oper. Res. 103 (2001) 161-173).
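To illustrate the idea of replacing line search with a fixed stepsize formula, here is a minimal sketch of a nonlinear conjugate gradient iteration in which the stepsize is computed in closed form from a Lipschitz constant of the gradient, a rule of the kind used in this literature. The function name, the Fletcher-Reeves choice of the conjugacy parameter beta, and the restart safeguard are illustrative assumptions, not the exact two-parameter family studied in the paper.

```python
import numpy as np

def cg_fixed_step(grad, x0, L, delta=0.5, iters=2000, tol=1e-8):
    """Nonlinear conjugate gradient method WITHOUT line search.

    Instead of searching along d_k, the stepsize is given by the fixed formula
        alpha_k = delta * (-g_k . d_k) / (L * ||d_k||^2),
    where L is a Lipschitz constant of the gradient and 0 < delta < 1.
    (Illustrative rule of the kind discussed by Sun and Zhang; the beta
    update below is the Fletcher-Reeves choice, an assumption here.)
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # initial search direction: steepest descent
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        gd = g @ d
        if gd >= 0.0:           # safeguard: restart if not a descent direction
            d = -g
            gd = g @ d
        # Fixed stepsize formula -- no function evaluations, no line search.
        alpha = -delta * gd / (L * (d @ d))
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves conjugacy parameter
        d = -g_new + beta * d
        g = g_new
    return x
```

Because each iteration needs only one gradient evaluation and no function evaluations, the cost per step is fixed, which is exactly the advantage claimed when line search is expensive.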