The limited memory BFGS method (L-BFGS) is an adaptation of the BFGS method for large-scale unconstrained optimization. However, the L-BFGS method need not converge for nonconvex objective functions, and it can be inefficient on highly ill-conditioned problems. In this paper, we propose a regularization strategy for the L-BFGS method, in which the regularization parameter plays a compensating role, in some sense, when the condition number of the Hessian approximation deteriorates. We then propose a regularized L-BFGS method and establish its global convergence even when the objective function is nonconvex. Numerical results show that the proposed method is efficient.
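The abstract does not spell out the paper's regularization strategy, so no attempt is made here to reproduce it. For context, a minimal sketch of the baseline being modified, the standard L-BFGS two-loop recursion with an Armijo backtracking line search, might look as follows. All function names, the memory size `m`, and the line-search constants are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lbfgs_two_loop(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns H_k @ grad, where H_k is
    the implicit inverse-Hessian approximation built from the stored
    curvature pairs (s_i, y_i), oldest first in the lists."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Common initial scaling H_0 = gamma * I with gamma = (s'y)/(y'y).
    if s_list:
        s, y = s_list[-1], y_list[-1]
        gamma = (s @ y) / (y @ y)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r += (a - b) * s
    return r

def lbfgs_minimize(f, grad_f, x0, m=5, max_iter=200, tol=1e-8):
    """Minimize f by L-BFGS with Armijo backtracking; keeps at most m pairs."""
    x = np.asarray(x0, dtype=float)
    s_list, y_list = [], []
    g = grad_f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -lbfgs_two_loop(g, s_list, y_list)  # search direction
        # Backtracking Armijo line search.
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        # Skip the update unless the curvature condition s'y > 0 holds,
        # so the implicit Hessian approximation stays positive definite.
        if s @ y > 1e-10:
            s_list.append(s); y_list.append(y)
            if len(s_list) > m:
                s_list.pop(0); y_list.pop(0)
        x, g = x_new, g_new
    return x
```

The curvature-condition check is what the memory update relies on for positive definiteness; for nonconvex objectives it can reject many pairs, which is one motivation for modifying or regularizing the update as the paper does.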