We present a branch and bound algorithm for the global optimization of a twice differentiable nonconvex objective function with a Lipschitz continuous Hessian over a compact, convex set. The algorithm applies cubic regularization techniques to the objective function within an overlapping branch and bound framework for convex constrained global optimization. Unlike other branch and bound algorithms, ours obtains lower bounds via nonconvex underestimators of the objective function. As a numerical example, we apply the proposed branch and bound algorithm to radial basis function approximations.
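The lower bounds described above rest on a standard consequence of Hessian Lipschitz continuity: if the Hessian of f is L-Lipschitz, then the cubic model f(x) + g(x)ᵀd + ½dᵀH(x)d − (L/6)‖d‖³, with d = y − x, underestimates f(y) everywhere, and minimizing it over a subregion yields a valid lower bound for that region. The following is a minimal sketch of that idea only, not the paper's algorithm; the function names are illustrative, and the grid search is a crude stand-in for minimizing the (nonconvex) underestimator exactly over a subregion.

```python
import numpy as np

def cubic_lower_bound(fx, gx, Hx, L, x, y):
    """Pointwise cubic underestimator, valid when the Hessian of f is
    L-Lipschitz on a region containing x and y:
        f(y) >= f(x) + g(x)^T d + 0.5 d^T H(x) d - (L/6) ||d||^3,  d = y - x.
    """
    d = np.asarray(y, float) - np.asarray(x, float)
    return (fx + np.dot(gx, d) + 0.5 * d @ np.asarray(Hx) @ d
            - (L / 6.0) * np.linalg.norm(d) ** 3)

# Toy 1-d check on f(t) = t^4 over [-1, 1]; its Hessian 12 t^2 has
# Lipschitz constant L = 24 on this interval. Expand about x = 0.5.
x = np.array([0.5])
fx = 0.5 ** 4
gx = np.array([4 * 0.5 ** 3])
Hx = np.array([[12 * 0.5 ** 2]])
L = 24.0

# A lower bound for the subregion [-1, 1] is the minimum of the
# underestimator over the region (approximated here on a sample grid).
ys = np.linspace(-1.0, 1.0, 201)
lb = min(cubic_lower_bound(fx, gx, Hx, L, x, np.array([t])) for t in ys)

# The underestimator never exceeds f at any sampled point.
assert all(cubic_lower_bound(fx, gx, Hx, L, x, np.array([t])) <= t ** 4 + 1e-12
           for t in ys)
```

In a branch and bound scheme of this kind, such region-wise lower bounds are compared against the best objective value found so far, and subregions whose lower bound exceeds it can be discarded.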