This research proposes a solution approach for the ν-support vector machine (ν-SVM) for classification problems, based on a modified matrix splitting method and incomplete Cholesky decomposition. With a minor modification, the dual formulation of ν-SVM classification becomes a singly linearly constrained convex quadratic program with box constraints. The kernel Hessian matrix of this problem is large and dense. The matrix splitting method, combined with the projected gradient method, iteratively solves subproblems with a diagonal Hessian matrix until the iterates reach the optimum. Several line search and step-size (alpha) update rules can be used within the projected gradient method. The incomplete Cholesky decomposition is used to make the large-scale Hessian and matrix–vector computations tractable. The proposed method is applied to a real-world classification problem: credit prediction for small-sized Korean companies.
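To illustrate the structure of the problem described above, here is a minimal sketch of a projected gradient iteration for a singly linearly constrained convex quadratic program with box constraints. This is not the authors' algorithm: the diagonal-splitting subproblem is simplified to a plain projected-gradient step with a fixed scalar step size, the kernel Hessian is a small explicit matrix rather than an incomplete Cholesky approximation, and all function names (`project_box_simplex`, `solve_qp`) are hypothetical.

```python
import numpy as np

def project_box_simplex(x, l, u, a, b, tol=1e-10):
    """Euclidean projection of x onto {z : a @ z = b, l <= z <= u},
    assuming a > 0, via bisection on the equality multiplier."""
    lo, hi = -1e6, 1e6
    z = np.clip(x, l, u)
    for _ in range(200):
        lam = 0.5 * (lo + hi)
        z = np.clip(x - lam * a, l, u)          # box part is a simple clip
        g = a @ z - b                           # residual of the linear constraint
        if abs(g) < tol:
            break
        if g > 0:                               # constraint value too large: raise lam
            lo = lam
        else:
            hi = lam
    return z

def solve_qp(Q, c, a, b, l, u, iters=500):
    """Projected gradient for min 0.5 x^T Q x - c^T x over the
    box-and-single-equality feasible set (sketch, fixed step size)."""
    x = project_box_simplex(np.zeros_like(c), l, u, a, b)
    step = 1.0 / np.linalg.norm(Q, 2)           # 1/L for the quadratic objective
    for _ in range(iters):
        grad = Q @ x - c
        x = project_box_simplex(x - step * grad, l, u, a, b)
    return x

# Tiny example: min 0.5*(2x^2 + 4y^2) - x - y  s.t.  x + y = 1, 0 <= x, y <= 1.
# The optimum is (2/3, 1/3) by the KKT conditions.
Q = np.diag([2.0, 4.0])
c = np.array([1.0, 1.0])
x = solve_qp(Q, c, a=np.ones(2), b=1.0, l=np.zeros(2), u=np.ones(2))
```

In the SVM setting, `Q` would be the dense kernel matrix, which is why the abstract pairs this iteration with a diagonal splitting (cheap subproblems) and an incomplete Cholesky factorization (cheap matrix–vector products).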