In this paper, a new trust region algorithm is proposed for solving unconstrained optimization problems. The method can be regarded as a combination of the trust region technique, a fixed step-length strategy, and ODE-based methods. One feature of the proposed method is that, at each iteration, only a single system of linear equations is solved to obtain a trial step. Another is that, when a trial step is not accepted, the method generates the next iterate with a step length defined by an explicit formula. Under standard assumptions, the algorithm is proven to be globally convergent and locally superlinearly convergent. Preliminary numerical results are reported.
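The general flavor of such an iteration can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the regularized linear system `(B + mu*I) d = -g`, the acceptance-ratio threshold `0.25`, the regularization updates, and the fixed fraction `alpha` used on rejection are all assumptions standing in for the paper's specific formulas.

```python
import numpy as np

def minimize(f, grad_f, hess_f, x0, mu=1.0, alpha=0.5, tol=1e-8, max_iter=200):
    """Illustrative regularized trust-region-style iteration (not the paper's
    method): one linear solve per iteration, with a fixed-fraction step
    taken when the trial step is rejected."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        B = hess_f(x)
        # Only one system of linear equations per iteration:
        # (B + mu*I) d = -g, an ODE-based (implicit-Euler-like)
        # regularized Newton step.
        d = np.linalg.solve(B + mu * np.eye(x.size), -g)
        pred = -(g @ d + 0.5 * d @ (B @ d))   # model (predicted) reduction
        ared = f(x) - f(x + d)                # actual reduction
        if pred > 0 and ared >= 0.25 * pred:
            x = x + d                 # trial step accepted
            mu = max(0.5 * mu, 1e-8)  # relax the regularization
        else:
            # Trial step rejected: rather than re-solving, move a fixed
            # fraction of d (the paper instead uses an explicit
            # step-length formula here).
            x = x + alpha * d
            mu = 2.0 * mu
    return x
```

For example, on the convex quadratic `f(x) = x·x` (gradient `2x`, Hessian `2I`), every trial step is accepted and the iterates contract rapidly toward the minimizer at the origin.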