Evaluating derivatives: principles and techniques of algorithmic differentiation
Testing Unconstrained Optimization Software
ACM Transactions on Mathematical Software (TOMS)
Automatic differentiation of algorithms
Journal of Computational and Applied Mathematics, special issue: Numerical Analysis 2000, Vol. IV: Optimization and Nonlinear Equations
An Inexact Newton Method Derived from Efficiency Analysis
Journal of Global Optimization
Automatic Differentiation: Applications, Theory, and Implementations (Lecture Notes in Computational Science and Engineering)
An efficient version of a new improved method of tangent hyperbolas
LSMS'07: Proceedings of the 2007 International Conference on Life System Modeling and Simulation: Bio-Inspired Computational Intelligence and Applications
Original Halley method and its improvement with automatic differentiation
FSKD'09: Proceedings of the 6th International Conference on Fuzzy Systems and Knowledge Discovery, Volume 4
For unconstrained optimization, an inexact Newton algorithm was recently proposed in which the preconditioned conjugate gradient method is applied to solve the Newton equations. In this paper, we improve that algorithm by using automatic differentiation efficiently, and we establish a new inexact Newton algorithm. Based on the efficiency coefficient defined by Brent, we introduce a theoretical efficiency ratio of the new algorithm to the old one. We show that this ratio is greater than 1, which implies that the new algorithm is always more efficient than the old one; moreover, the improvement is significant in at least some cases. This theoretical conclusion is supported by numerical experiments.
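In standard truncated-Newton practice, the gain the abstract alludes to comes from the fact that each conjugate gradient iteration needs only a Hessian-vector product, which automatic differentiation delivers exactly at a cost comparable to a few gradient evaluations, so the Hessian is never formed. The sketch below illustrates that idea in JAX; it is not the authors' implementation, it uses plain (unpreconditioned) CG for brevity, and the function names, tolerances, and test problem are illustrative assumptions.

```python
import jax
import jax.numpy as jnp


def hvp(f, x, v):
    """Hessian-vector product H(x) @ v via forward-over-reverse AD."""
    return jax.jvp(jax.grad(f), (x,), (v,))[1]


def cg_solve(matvec, b, tol=1e-6, max_iters=50):
    """Plain conjugate gradient for matvec(d) = b (no preconditioner here)."""
    d = jnp.zeros_like(b)
    r = b
    p = r
    rs = r @ r
    for _ in range(max_iters):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        d = d + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if jnp.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d


def inexact_newton(f, x0, tol=1e-8, max_iters=100):
    grad_f = jax.grad(f)
    x = x0
    for _ in range(max_iters):
        g = grad_f(x)
        if jnp.linalg.norm(g) < tol:
            break
        # Solve H(x) d = -g approximately; each CG step costs one
        # AD Hessian-vector product instead of a full Hessian.
        step = cg_solve(lambda v: hvp(f, x, v), -g)
        x = x + step  # a line search would guard global convergence
    return x


# Usage on the Rosenbrock function (an illustrative test problem):
def rosenbrock(x):
    return jnp.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

x_star = inexact_newton(rosenbrock, jnp.zeros(10))
```

The design point worth noting is that `hvp` costs a small constant multiple of one gradient evaluation, independent of the problem dimension, which is what makes matrix-free inner CG solves attractive for large unconstrained problems.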