The Q method for second order cone programming
Computers and Operations Research
We develop optimality conditions for the second-order cone program. Our optimality conditions are well defined and smooth everywhere. We then reformulate the optimality conditions into several systems of equations. Starting from a solution to the original problem, the sequence generated by Newton's method converges Q-quadratically to a solution of the perturbed problem under some assumptions. We globalize the algorithm by (1) extending the gradient descent method for differentiable optimization to the minimization of continuous functions that are differentiable almost everywhere; (2) finding a directional derivative of the equations. Numerical examples confirm that our algorithm is well suited to "warm starting" second-order cone programs; in some cases, the solution of a perturbed instance is reached in two iterations. In the course of developing the algorithm, we also generalize the nonlinear complementarity function approach from two variables to several variables.
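The globalization strategy the abstract describes — Newton steps on a system of equations, damped by a descent condition on a merit function — can be illustrated with a minimal sketch. This is not the paper's algorithm: it applies a plain damped Newton method with an Armijo-type backtracking line search on the merit function m(x) = ½‖F(x)‖² to a hypothetical smooth toy system, and all function names and tolerances are our own.

```python
import numpy as np

def damped_newton(F, J, x, tol=1e-10, max_iter=50, beta=0.5, sigma=1e-4):
    """Solve F(x) = 0 by Newton's method, globalized by backtracking
    on the merit function m(x) = 0.5 * ||F(x)||^2."""
    for _ in range(max_iter):
        Fx = F(x)
        m = 0.5 * Fx @ Fx
        if np.sqrt(2.0 * m) < tol:
            break
        d = np.linalg.solve(J(x), -Fx)  # Newton direction
        t = 1.0
        # Backtrack until the merit function decreases sufficiently.
        while True:
            Fn = F(x + t * d)
            if 0.5 * Fn @ Fn <= (1.0 - 2.0 * sigma * t) * m or t < 1e-12:
                break
            t *= beta
        x = x + t * d
    return x

# Toy system: x0^2 + x1^2 = 2 and x0 = x1, with solution (1, 1).
F = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])
sol = damped_newton(F, J, np.array([3.0, 0.5]))
```

Near the solution the unit step is always accepted, so the iteration reduces to the pure Newton method and inherits its local quadratic convergence, which is the behavior the abstract exploits for warm starting.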