A modified conjugate gradient method is presented for solving unconstrained optimization problems. The method possesses the following properties: (i) the sufficient descent property holds without any line search; (ii) the search direction automatically lies in a trust region; (iii) the Zoutendijk condition holds under the Wolfe-Powell line search; (iv) the method inherits an important property of the well-known Polak-Ribière-Polyak (PRP) method, namely the tendency to turn toward the steepest descent direction when a small step is generated away from the solution, which prevents a sequence of tiny steps. The global convergence and the linear convergence rate of the method are established. Numerical results indicate that the proposed method is effective in practice.
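To make the ingredients of such a method concrete, the following is a minimal sketch of a PRP-type conjugate gradient loop with a Wolfe-Powell line search and a simple sufficient-descent safeguard. The abstract does not state the authors' modified direction formula, so the classical PRP coefficient with a restart safeguard is used here purely for illustration; the test function (Rosenbrock), the tolerance, and all parameter values are assumptions rather than the paper's setup.

```python
# Illustrative PRP-type conjugate gradient sketch (not the authors' exact method).
import numpy as np
from scipy.optimize import line_search  # strong Wolfe line search

def rosenbrock(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                               # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Wolfe-Powell line search along d.
        alpha, *_ = line_search(f, grad, x, d)
        if alpha is None:                # line search failed: restart
            d = -g
            alpha, *_ = line_search(f, grad, x, d)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Classical PRP coefficient: when the step (hence g_new - g) is small,
        # beta is small and the direction turns toward steepest descent.
        beta = g_new @ (g_new - g) / (g @ g)
        d_new = -g_new + beta * d
        # Safeguard: restart if d_new is not a (sufficient) descent direction.
        if g_new @ d_new > -1e-10 * (g_new @ g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

x_star = prp_cg(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
print(x_star)  # should approach the minimizer [1, 1]
```

In the paper's method, the sufficient descent and trust-region properties are built into the search direction itself rather than enforced by a restart test as in this sketch.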