On the Halley class of methods for unconstrained optimization problems
Optimization Methods & Software - The 2nd International Conference on Nonlinear Programming with Applications
The original Halley method, also referred to as the method of tangent hyperbolas, is a classical third-order method that can be used to solve optimization problems. However, the cost of computing the derivative terms in the Halley equation is the key obstacle to reducing its computational cost. By efficiently applying the techniques of automatic differentiation and the preconditioned conjugate gradient method, the original Halley method is implemented and improved in this paper. Theoretical and numerical results show that the improved version is more efficient than the original method.
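The iteration behind the tangent hyperbolas method can be illustrated with a minimal one-dimensional sketch. Here `g` plays the role of the gradient of the objective whose zero is sought; in the paper's setting, automatic differentiation would supply the derivatives `dg` and `d2g`, whereas in this illustrative example they are written out by hand. The function names and the sample problem are assumptions for illustration, not the paper's implementation.

```python
def halley(g, dg, d2g, x0, tol=1e-12, max_iter=50):
    """Third-order Halley (tangent hyperbolas) iteration for g(x) = 0:
       x_{k+1} = x_k - 2 g g' / (2 g'^2 - g g'')."""
    x = x0
    for _ in range(max_iter):
        gx, dgx, d2gx = g(x), dg(x), d2g(x)
        step = 2.0 * gx * dgx / (2.0 * dgx ** 2 - gx * d2gx)
        x -= step
        if abs(step) < tol:
            break
    return x

# Illustrative example: the root of g(x) = x**3 - 2, i.e. the cube root of 2.
root = halley(lambda x: x ** 3 - 2,
              lambda x: 3 * x ** 2,
              lambda x: 6 * x,
              x0=1.0)
```

In the multivariate case each step requires solving a linear system involving second- and third-derivative terms, which is where the preconditioned conjugate gradient method discussed in the paper comes in.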