The nature of statistical learning theory
Making large-scale support vector machine learning practical
Advances in kernel methods
Fast training of support vector machines using sequential minimal optimization
Advances in kernel methods
A tutorial on support vector machines for pattern recognition
Data Mining and Knowledge Discovery
A modified finite Newton method for fast solution of large scale linear SVMs
The Journal of Machine Learning Research
Fast kernel classifiers with online and active learning
The Journal of Machine Learning Research
Fast support vector machines for continuous data
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics - Special issue on cybernetics and cognitive informatics
We propose a novel and fast algorithm to train support vector machines (SVMs) in the primal space. It solves an approximation of the SVM optimization problem that is unconstrained, continuous, and twice differentiable, using the Newton optimization technique. Further, we devise a special pre-extracting procedure that speeds up convergence by supplying a high-quality initial solution. Theoretical analysis shows that the proposed algorithm produces an ε-approximate solution to the standard SVM problem while maintaining low computational complexity. Experimental results on benchmark data sets demonstrate that our algorithm is much faster than dual-based methods such as SVMlight while achieving similar test accuracy.
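To illustrate the primal Newton idea described in the abstract, the sketch below trains a linear L2-SVM in the primal: the squared hinge loss yields an unconstrained, continuous objective with a generalized Hessian, so Newton steps apply. This is a minimal illustration of the general technique (close in spirit to the modified finite Newton method cited above), not the authors' exact algorithm; the function name and parameters are our own.

```python
import numpy as np

def primal_newton_svm(X, y, C=1.0, iters=20, tol=1e-6):
    """Illustrative primal Newton training of a linear L2-SVM.

    Minimizes  0.5*||w||^2 + C * sum_i max(0, 1 - y_i * w.x_i)^2,
    an unconstrained, continuous, piecewise-quadratic objective,
    via (generalized) Newton steps.  Not the paper's algorithm.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        margins = 1.0 - y * (X @ w)
        sv = margins > 0                 # margin-violating examples
        # Gradient: w - 2C * sum over violators of y_i * margin_i * x_i
        grad = w - 2.0 * C * X[sv].T @ (y[sv] * margins[sv])
        if np.linalg.norm(grad) < tol:
            break
        # Generalized Hessian: I + 2C * X_sv^T X_sv
        H = np.eye(d) + 2.0 * C * X[sv].T @ X[sv]
        w -= np.linalg.solve(H, grad)    # full Newton step
    return w
```

Because the objective is piecewise quadratic, each Newton step solves the problem exactly on the current active set, so only a handful of iterations are typically needed on low-dimensional data.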