LAPACK Users' Guide.
The nature of statistical learning theory.
Matrix computations (3rd ed.).
Making large-scale support vector machine learning practical. In Advances in kernel methods.
Fast training of support vector machines using sequential minimal optimization. In Advances in kernel methods.
An introduction to support vector machines and other kernel-based learning methods.
An updated set of basic linear algebra subprograms (BLAS). ACM Transactions on Mathematical Software (TOMS).
A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery.
Interior-point methods for massive support vector machines. SIAM Journal on Optimization.
Object-oriented software for quadratic programming. ACM Transactions on Mathematical Software (TOMS).
Sparse greedy matrix approximation for machine learning. In ICML '00: Proceedings of the Seventeenth International Conference on Machine Learning.
Lagrangian support vector machines. The Journal of Machine Learning Research.
Efficient SVM training using low-rank kernel representations. The Journal of Machine Learning Research.
Fast SVM training algorithm with decomposition on very large data sets. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Predictive low-rank decomposition for kernel methods. In ICML '05: Proceedings of the 22nd International Conference on Machine Learning.
Improvements to Platt's SMO algorithm for SVM classifier design. Neural Computation.
Working set selection using second order information for training support vector machines. The Journal of Machine Learning Research.
On the Nyström method for approximating a Gram matrix for improved kernel-based learning. The Journal of Machine Learning Research.
Maximum-gain working set selection for SVMs. The Journal of Machine Learning Research.
An efficient implementation of an active set method for SVMs. The Journal of Machine Learning Research.
This paper concerns the generation of support vector machine classifiers for solving the pattern recognition problem in machine learning. A method is proposed based on interior-point methods for convex quadratic programming. The interior-point method uses a preconditioned linear conjugate gradient method, with a novel preconditioner, to solve the linear system that determines each iterate from the previous one. An implementation is developed by adapting the object-oriented package OOQP to the problem structure. Numerical results are provided, and computational experience is discussed.
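The inner solve of such an interior-point method can be sketched with a preconditioned conjugate gradient routine. The sketch below uses a simple Jacobi (diagonal) preconditioner on a toy symmetric positive definite system whose low-rank-plus-diagonal structure loosely mimics the normal equations arising in SVM interior-point methods; it is an illustrative assumption, not the paper's novel preconditioner, and the matrix, sizes, and tolerances are made up for demonstration.

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient for SPD A with a diagonal
    preconditioner M^{-1} given as a vector of inverse diagonal entries."""
    x = np.zeros_like(b)
    r = b - A @ x            # initial residual
    z = M_inv_diag * r       # preconditioned residual
    p = z.copy()             # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)    # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # conjugate direction update
        rz = rz_new
    return x

# Toy SPD system with low-rank-plus-diagonal structure (hypothetical data):
rng = np.random.default_rng(0)
V = rng.standard_normal((50, 5))
A = V @ V.T + np.diag(1.0 + rng.random(50))
b = rng.standard_normal(50)
x = pcg(A, b, M_inv_diag=1.0 / np.diag(A))
print(np.linalg.norm(A @ x - b))
```

In a full interior-point solver, `A` and the preconditioner would change at every outer iteration as the barrier parameter is reduced; here only a single solve is shown.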