The (optimal) function/gradient evaluation worst-case complexity analysis available for the adaptive regularization algorithm with cubics (ARC) for nonconvex smooth unconstrained optimization is extended to finite-difference versions of this algorithm, yielding complexity bounds for first-order and derivative-free methods applied to the same problem class. A comparison with the results obtained for derivative-free methods by Vicente [Worst Case Complexity of Direct Search, Technical report, Preprint 10-17, Department of Mathematics, University of Coimbra, Coimbra, Portugal, 2010] is also discussed, giving some theoretical insight into the relative merits of various methods in this popular class of algorithms.
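As a minimal sketch (not taken from the paper) of what a "finite-difference version" of a derivative-based method replaces: the true gradient is approximated by forward differences, so each gradient estimate costs n extra function evaluations, which is where the evaluation-complexity bounds for derivative-free variants pick up their dimension dependence. The function names and the step size h below are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def fd_gradient(f, x, h=1e-8):
    """Forward-difference gradient estimate of f at x.

    Costs n additional evaluations of f beyond f(x), where n = x.size.
    """
    n = x.size
    g = np.empty(n)
    fx = f(x)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h  # perturb one coordinate at a time
        g[i] = (f(x + e) - fx) / h
    return g

# Illustration on f(x) = ||x||^2, whose exact gradient is 2x.
f = lambda x: float(np.dot(x, x))
g = fd_gradient(f, np.array([1.0, -2.0]))
```

A derivative-free variant of a first-order method then simply substitutes such an estimate wherever the analysis calls for the true gradient, at the price of the extra evaluations per iteration.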