The nature of statistical learning theory
On piecewise quadratic Newton and trust region problems
Mathematical Programming: Series A and B - Special issue on computational nonsmooth optimization
Fast training of support vector machines using sequential minimal optimization
Advances in kernel methods
SSVM: a smooth support vector machine for classification
Computational Optimization and Applications
Lagrangian support vector machines
The Journal of Machine Learning Research
Efficient SVM training using low-rank kernel representations
The Journal of Machine Learning Research
Training ν-support vector classifiers: theory and algorithms
Neural Computation
Improvements to Platt's SMO algorithm for SVM classifier design
Neural Computation
An overview of statistical learning theory
IEEE Transactions on Neural Networks
Successive overrelaxation for support vector machines
IEEE Transactions on Neural Networks
On the convergence of the decomposition method for support vector machines
IEEE Transactions on Neural Networks
Asymptotic convergence of an SMO algorithm without any assumptions
IEEE Transactions on Neural Networks
Support vector machines can be posed as quadratic programming problems in a variety of ways. This paper investigates the 2-norm soft-margin SVM with an additional quadratic penalty on the bias term, which leads to a positive definite quadratic program in feature space whose only constraints are nonnegativity constraints. An unconstrained optimization problem is proposed as the Lagrangian dual of this quadratic program for the linear classification problem. The resulting problem minimizes a differentiable convex piecewise quadratic function of lower dimension in input space, and a semismooth Newton algorithm is introduced to solve it quickly, yielding the Semismooth Newton Support Vector Machine (SNSVM). Once the kernel matrix is factorized by a Cholesky or incomplete Cholesky factorization, the nonlinear kernel classification problem can also be solved by SNSVM with no apparent increase in the complexity of the algorithm. Numerical experiments demonstrate that the algorithm is comparable with similar methods such as the Lagrangian Support Vector Machine (LSVM) and the Smooth Support Vector Machine (SSVM).
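The abstract's core idea — minimizing a differentiable convex piecewise quadratic function with a generalized (semismooth) Newton step — can be illustrated with a small sketch. This is not the paper's code: the objective below is a standard unconstrained 2-norm soft-margin formulation with the bias folded into the weight vector via an appended constant feature, and the parameter names (`C`, `semismooth_newton_svm`) are illustrative assumptions. The gradient is piecewise linear, so each Newton step builds a generalized Hessian from the currently margin-violating ("active") points.

```python
import numpy as np

def semismooth_newton_svm(X, y, C=10.0, tol=1e-8, max_iter=50):
    """Hedged sketch of a semismooth Newton iteration for the unconstrained
    piecewise quadratic SVM objective
        f(w) = 1/2 ||w||^2 + C/2 ||max(0, 1 - y * (A @ w))||^2,
    where A is X with a constant column appended (penalized bias)."""
    n, d = X.shape
    A = np.hstack([X, np.ones((n, 1))])      # fold bias into w
    w = np.zeros(d + 1)
    for _ in range(max_iter):
        margins = 1.0 - y * (A @ w)          # > 0 means the hinge is active
        grad = w - C * A.T @ (y * np.maximum(margins, 0.0))
        if np.linalg.norm(grad) < tol:
            break
        # Generalized Hessian: identity plus curvature from active points only
        # (this is where the semismoothness of the piecewise linear gradient
        # enters: the active set selects one element of the B-subdifferential).
        Aa = A[margins > 0]
        H = np.eye(d + 1) + C * Aa.T @ Aa
        w -= np.linalg.solve(H, grad)
    return w

# Toy usage on a well-separated two-class problem.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+2.0, 1.0, size=(40, 2)),
               rng.normal(-2.0, 1.0, size=(40, 2))])
y = np.concatenate([np.ones(40), -np.ones(40)])
w = semismooth_newton_svm(X, y)
acc = np.mean(np.sign(np.hstack([X, np.ones((80, 1))]) @ w) == y)
```

Because the objective is strictly convex and piecewise quadratic, the iteration typically terminates after the active set stabilizes, which is the behavior the paper exploits; for the kernel case, the abstract's (incomplete) Cholesky factor of the kernel matrix would play the role of the feature matrix `A` here.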