Expert Systems with Applications: An International Journal
Based on incremental and decremental learning strategies, this paper presents an adaptive support vector machine learning algorithm (ASVM) for large classification problems. The proposed algorithm alternates incremental and decremental procedures to adaptively form a small working set that covers most of the information in the training set and overcomes the loss of sparseness in the least squares support vector machine (LS-SVM). The classifier is then constructed from this working set alone. Because the working set typically contains far fewer elements than the training set, the algorithm can not only train on large data sets quickly but also test them efficiently, with little loss of accuracy. To examine the training speed and generalization performance of the proposed algorithm, we apply both ASVM and LS-SVM to seven UCI datasets and a benchmark problem. Experimental results show that ASVM is much faster than LS-SVM and loses little accuracy when solving large classification problems.
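The alternating incremental/decremental working-set idea described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: the kernel, the `gamma`/`sigma` parameters, the batch sizes, and the selection heuristics (adding the points with the largest margin violation, pruning the points with the smallest |alpha|) are all assumptions made for the sketch. It solves the standard LS-SVM linear system on the current working set at each step.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and B.
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM system  [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, coefficients alpha

def lssvm_predict(Xtr, b, alpha, Xte, sigma=1.0):
    # Decision values f(x) = sum_i alpha_i k(x, x_i) + b.
    return rbf_kernel(Xte, Xtr, sigma) @ alpha + b

def asvm_sketch(X, y, batch=20, drop_frac=0.2, rounds=5, gamma=10.0, sigma=1.0):
    # Hypothetical working-set loop in the spirit of the abstract:
    # alternately grow the working set (incremental) and prune it (decremental).
    idx = np.arange(min(batch, len(y)))  # initial working set
    for _ in range(rounds):
        b, alpha = lssvm_train(X[idx], y[idx], gamma, sigma)
        # Incremental step: add the points outside the working set with the
        # smallest y_i * f(x_i), i.e. the worst-classified points.
        out = np.setdiff1d(np.arange(len(y)), idx)
        if len(out):
            m = y[out] * lssvm_predict(X[idx], b, alpha, X[out], sigma)
            idx = np.concatenate([idx, out[np.argsort(m)[:batch]]])
            b, alpha = lssvm_train(X[idx], y[idx], gamma, sigma)
        # Decremental step: drop the working-set points with the smallest
        # |alpha|, restoring sparseness that plain LS-SVM lacks.
        keep = np.argsort(-np.abs(alpha))[: max(2, int(len(idx) * (1 - drop_frac)))]
        idx = idx[np.sort(keep)]
    b, alpha = lssvm_train(X[idx], y[idx], gamma, sigma)
    return idx, b, alpha
```

The final classifier is evaluated only against the retained working set, so both training (one small linear solve per step) and testing scale with the working-set size rather than with the full training set, which is the source of the speedup the abstract reports.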