For unconstrained optimization, the two-point stepsize gradient method is preferable to the classical steepest descent method both in theory and in practical computation. In this paper we interpret the stepsize choice of the two-point stepsize gradient method from the standpoint of interpolation and propose two modified two-point stepsize gradient methods. The modified methods are globally convergent under mild assumptions on the objective function. Numerical results are reported which suggest that improvements have been achieved.
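The modified methods themselves are not reproduced in this excerpt. For orientation, the following is a minimal sketch of the classical two-point stepsize (Barzilai-Borwein) iteration that the paper builds on, using the BB1 stepsize alpha_k = s'.s / s'.y with s = x_k - x_{k-1} and y = g_k - g_{k-1}; the function name, the quadratic test problem, and the fallback rule for nonpositive curvature are illustrative assumptions, not the authors' method.

```python
import numpy as np

def bb_gradient(grad, x0, max_iter=500, tol=1e-8, alpha0=1e-3):
    """Two-point stepsize (Barzilai-Borwein) gradient method, BB1 variant.

    grad   : callable returning the gradient of the objective at x.
    alpha0 : fixed stepsize for the first iteration, before two points
             are available to form the BB stepsize.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g          # plain gradient step
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB1 stepsize: alpha = s's / s'y.  If the curvature s'y is not
        # positive (possible on nonconvex problems), reuse the previous
        # stepsize as an illustrative safeguard.
        alpha = (s @ s) / sy if sy > 0 else alpha
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose gradient is Ax - b and whose minimizer solves Ax = b.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
sol = bb_gradient(lambda x: A @ x - b, np.zeros(3))
```

On quadratics the BB iteration is typically nonmonotone in the objective yet converges far faster than steepest descent, which is the behavior the paper's interpolation viewpoint seeks to explain and improve.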