Support Vector Machines are finding application in pattern recognition, regression estimation, and operator inversion. To broaden their range of use, researchers have long sought online training algorithms. A Support Vector Machine, however, is sensitive only to the extreme values (the support vectors) and not to the distribution of the data as a whole, so an ordinary algorithm cannot predict in advance which vectors will matter and must process all the data at once. This paper introduces an algorithm that selects promising vectors from the given data: whenever a new vector is added to the training set, unnecessary vectors are identified and deleted, which yields an online algorithm in a natural way. We justify the deletion of unnecessary vectors, give a method for computing which vectors to delete, and finish with an example illustrating the validity of the algorithm.
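The abstract does not spell out the algorithm, but the scheme it describes (add one vector at a time, retrain, then discard vectors the classifier no longer needs) can be sketched as follows. This is an assumption-laden illustration, not the paper's method: here "unnecessary vectors" are taken to be points whose functional margin exceeds 1 (i.e., clearly non-support vectors), and a Pegasos-style subgradient solver for a soft-margin linear SVM stands in for whatever solver the paper uses. The function names (`train_linear_svm`, `prune`, `online_svm`) are invented for this sketch.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Soft-margin linear SVM via subgradient descent on the hinge loss
    (a Pegasos-style stand-in for the paper's solver)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for t in range(1, epochs + 1):
        eta = 1.0 / (lam * t)
        viol = y * (X @ w + b) < 1          # points violating the margin
        grad_w = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n
        grad_b = -y[viol].sum() / n
        w -= eta * grad_w
        b -= eta * grad_b
    return w, b

def prune(X, y, w, b, tol=1e-3):
    """Keep only candidate support vectors: points with functional
    margin <= 1 + tol. Everything else is deemed 'unnecessary'."""
    keep = y * (X @ w + b) <= 1 + tol
    return X[keep], y[keep]

def online_svm(stream, lam=0.01):
    """Process examples one at a time; after each retrain, delete
    unnecessary vectors so the working set stays small."""
    Xw = yw = w = b = None
    for x, label in stream:
        if Xw is None:
            Xw, yw = np.array([x]), np.array([label], dtype=float)
        else:
            Xw, yw = np.vstack([Xw, x]), np.append(yw, label)
        if len(np.unique(yw)) < 2:
            continue                        # need both classes to train
        w, b = train_linear_svm(Xw, yw, lam)
        Xw, yw = prune(Xw, yw, w, b)
    return w, b, Xw, yw
```

On well-separated data the working set stays close to the set of support vectors, which is the point of the pruning step; how to choose the deletion criterion safely is exactly the question the paper addresses.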