References:
- Efficient hybrid conjugate gradient techniques. Journal of Optimization Theory and Applications.
- Optimization: Algorithms and Consistent Approximations.
- Testing Unconstrained Optimization Software. ACM Transactions on Mathematical Software (TOMS).
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property. SIAM Journal on Optimization.
In this paper we propose two kinds of conjugate gradient methods for unconstrained optimization, obtained by combining existing conjugate gradient methods. The new methods can be regarded as modifications of the efficient hybrid methods proposed by Touati-Ahmed and Storey and by Dai and Yuan. Under mild conditions, global convergence is proved. Preliminary numerical results for the new methods are encouraging.
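To make the idea of a hybrid conjugate gradient method concrete, the sketch below implements one well-known hybrid rule of the Touati-Ahmed/Storey type, in which the Polak-Ribiere-Polyak coefficient is clipped to the interval [0, beta_FR]. This is an illustrative example of the general technique the abstract refers to, not the specific rules proposed in the paper; the test problem (a small strongly convex quadratic) and all names are the example's own.

```python
import math

# A small strongly convex quadratic f(x) = 0.5 x^T A x - b^T x, whose
# gradient is A x - b; any symmetric positive definite A works here.
A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def matvec(M, v):
    return [dot(row, v) for row in M]

def grad(x):
    return [gi - bi for gi, bi in zip(matvec(A, x), b)]

def hybrid_beta(g_new, g_old):
    # Hybrid coefficient of Touati-Ahmed/Storey type: the Polak-Ribiere-Polyak
    # value clipped between 0 and the Fletcher-Reeves value. This illustrates
    # the hybrid idea; it is not the paper's exact rule.
    y = [gn - go for gn, go in zip(g_new, g_old)]
    denom = dot(g_old, g_old)
    beta_fr = dot(g_new, g_new) / denom
    beta_prp = dot(g_new, y) / denom
    return max(0.0, min(beta_prp, beta_fr))

def hybrid_cg(x, iters=50, tol=1e-10):
    g = grad(x)
    d = [-gi for gi in g]          # initial direction: steepest descent
    for _ in range(iters):
        if math.sqrt(dot(g, g)) < tol:
            break
        # Exact line search along d (valid because f is quadratic):
        # minimize f(x + t d) over t, giving t = -g.d / (d^T A d).
        Ad = matvec(A, d)
        t = -dot(g, d) / dot(d, Ad)
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = hybrid_beta(g_new, g)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_star = hybrid_cg([0.0, 0.0, 0.0])
residual = max(abs(r - bi) for r, bi in zip(matvec(A, x_star), b))
print(x_star, residual)
```

On a quadratic with exact line search the hybrid coefficient coincides with the classical one (successive gradients are orthogonal), so the iteration reduces to linear conjugate gradients and terminates in at most n steps; the hybrid rule only changes behavior on general nonlinear objectives, where it guards against the jamming that the pure Fletcher-Reeves choice can exhibit.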