A new inexact method of tangent hyperbolas (NIMTH) has been proposed recently. In NIMTH, the Newton equation and the Newton-like equation are solved, periodically, by one Cholesky factorization (CF) step and by p preconditioned conjugate gradient (PCG) steps, respectively. The algorithm is efficient in theory, but its practical implementation has so far been limited. In this paper, an efficient version of NIMTH is presented in which the parameter p is independent of the complexity of the objective function and the tensor terms are evaluated efficiently by automatic differentiation. Further theoretical analysis and numerical experiments show that this version of NIMTH is competitive for medium- and large-scale unconstrained optimization problems.
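The iteration structure described above can be sketched as follows. This is only an illustrative reconstruction, not the authors' implementation: the function names (`nimth_step`, `tensor_vec`) and the toy objective are assumptions, the Cholesky factor of the Hessian is reused here as the PCG preconditioner, and the hand-coded derivatives stand in for the automatic differentiation the paper advocates.

```python
import numpy as np

def nimth_step(x, grad, hess, tensor_vec, p=3):
    """One illustrative NIMTH-style iteration: an exact Newton step via a
    single Cholesky factorization (CF), then an approximate tangent-
    hyperbolas correction via at most p preconditioned CG (PCG) steps."""
    g = grad(x)
    H = hess(x)
    L = np.linalg.cholesky(H)                        # the one CF step
    H_inv = lambda v: np.linalg.solve(L.T, np.linalg.solve(L, v))
    s = H_inv(-g)                                    # Newton step s1
    M = H + 0.5 * tensor_vec(x, s)                   # H + (1/2) T[s1]
    # p PCG steps on the Newton-like equation M s = -g, warm-started at s1,
    # reusing the Cholesky factor of H as the preconditioner.
    r = -g - M @ s
    z = H_inv(r)
    d = z.copy()
    for _ in range(p):
        rz = r @ z
        if rz <= 1e-30:                              # residual already negligible
            break
        Md = M @ d
        alpha = rz / (d @ Md)
        s = s + alpha * d
        r = r - alpha * Md
        z = H_inv(r)
        d = z + (r @ z) / rz * d                     # standard PCG update
    return x + s

# Toy problem (an assumption for the demo):
# f(x) = (1/4) * sum(x_i^4) + (1/2) * ||x||^2, minimized at x = 0.
grad = lambda x: x**3 + x
hess = lambda x: np.diag(3.0 * x**2 + 1.0)
tensor_vec = lambda x, v: np.diag(6.0 * x * v)       # third-derivative tensor applied to v

x = np.array([1.0, -2.0, 0.5])
for _ in range(8):
    x = nimth_step(x, grad, hess, tensor_vec, p=3)
```

In this sketch both the CF step and the PCG steps happen in every iteration; the periodic alternation and the AD-based tensor evaluation of the actual method are abstracted away.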