We investigate Newton-type optimization methods for solving piecewise linear systems (PLSs) with a nondegenerate coefficient matrix. Such systems arise, for example, from the numerical solution of linear complementarity problems, which can be used to model several learning and optimization problems. In this letter, we propose an effective damped Newton method, PLS-DN, to find the exact (up to machine precision) solution of nondegenerate PLSs. PLS-DN exhibits a provable semi-iterative property: the algorithm converges globally to the exact solution in a finite number of iterations, and the rate of convergence is shown to be at least linear before termination. We emphasize the applications of our method in modeling, from the novel perspective of PLSs, several statistical learning problems such as box-constrained least squares, elitist lasso (Kowalski & Torresani, 2008), and support vector machines (Cortes & Vapnik, 1995). Numerical results on synthetic and benchmark data sets demonstrate the effectiveness and efficiency of PLS-DN on these problems.
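To illustrate the connection between linear complementarity problems and piecewise linear systems that the abstract alludes to, the following is a minimal sketch of a damped Newton iteration on the standard nonsmooth reformulation F(x) = min(x, Mx + q) = 0 of the LCP. This is a generic textbook-style scheme for exposition only, not the authors' PLS-DN algorithm; the function name and the Armijo constants are illustrative choices.

```python
import numpy as np

def lcp_damped_newton(M, q, x0=None, tol=1e-10, max_iter=100):
    """Damped Newton method on F(x) = min(x, Mx + q), whose root solves the
    LCP: find x >= 0 with w = Mx + q >= 0 and x' w = 0.
    Illustrative sketch; assumes the LCP is nondegenerate (e.g. M a P-matrix).
    """
    n = len(q)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    for _ in range(max_iter):
        w = M @ x + q
        F = np.minimum(x, w)
        if np.linalg.norm(F) < tol:
            break
        # One element of the generalized Jacobian of F: row i is the i-th
        # identity row where x_i <= w_i (the min picks x_i), else row i of M.
        J = np.where((x <= w)[:, None], np.eye(n), M)
        d = np.linalg.solve(J, -F)
        # Damping: Armijo backtracking on the merit function 0.5 * ||F||^2.
        t, merit = 1.0, 0.5 * F @ F
        while t > 1e-12:
            xn = x + t * d
            Fn = np.minimum(xn, M @ xn + q)
            if 0.5 * Fn @ Fn <= (1 - 1e-4 * t) * merit:
                break
            t *= 0.5
        x = x + t * d
    return x
```

For a small positive-definite example such as M = [[2, 1], [1, 2]], q = [-3, -1], the iteration terminates in a handful of steps at a point satisfying the complementarity conditions, consistent with the finite-termination behavior the letter proves for its own method on nondegenerate PLSs.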