A Kind of Approximately Linear Support Vector Machine Based on Variational Inequality
PACIIA '08: Proceedings of the 2008 IEEE Pacific-Asia Workshop on Computational Intelligence and Industrial Application, Volume 01
To reduce computational complexity and speed up the solution of the quadratic programming problem that arises in training, this paper proposes an effective training method for the linear support vector machine based on variational inequality (VILSVM) and reports experimental results. The method transforms the convex quadratic programming problem solved when training a linear support vector machine into a variational inequality problem, and obtains the optimal separating hyperplane by solving that variational inequality. Because the solution process does not generate high-memory intermediate data, both the training and test speed of the support vector machine classifier can be increased. The paper gives the transformation formula and the specific algorithm. VILSVM was applied to the multidimensional Iris training samples; the simulation results show that VILSVM has high generalization ability and classifies the test samples accurately. In addition, it converges faster than the traditional support vector machine, with an average time reduction of 88%.
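The abstract does not reproduce the paper's transformation formula, so the following is only a minimal sketch of the general idea: the optimality conditions of the dual SVM quadratic program are equivalent to a variational inequality over the feasible set, which a projection-type (extragradient) method can solve. For simplicity the sketch drops the bias term, so the feasible set is a box and the projection is a clip; the function name train_vilsvm, the step-size rule, and the toy data are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def train_vilsvm(X, y, C=1.0, max_iter=5000, tol=1e-6):
    """Minimal sketch (assumed formulation, not the paper's exact one):
    a bias-free linear SVM trained by recasting the dual QP as a
    box-constrained variational inequality (VI) and solving it with
    the extragradient projection method.

    Dual QP:  min 0.5 a^T Q a - 1^T a,  0 <= a_i <= C,
    with Q_ij = y_i y_j <x_i, x_j>.  Its optimality condition is the VI:
    find a* in [0, C]^n with (Q a* - 1)^T (a - a*) >= 0 for all feasible a.
    """
    n = X.shape[0]

    def F(a):
        # Apply F(a) = Q a - 1 without ever forming the n x n matrix Q,
        # keeping memory use low, in the spirit of the abstract's claim.
        return y * (X @ (X.T @ (a * y))) - 1.0

    proj = lambda a: np.clip(a, 0.0, C)   # projection onto the box [0, C]^n
    L = np.linalg.norm(X, 2) ** 2         # Lipschitz constant of F (= ||Q||_2)
    step = 0.9 / L                        # step < 1/L ensures convergence

    a = np.zeros(n)
    for _ in range(max_iter):
        a_half = proj(a - step * F(a))        # predictor (projected) step
        a_next = proj(a - step * F(a_half))   # corrector step
        if np.linalg.norm(a_next - a) < tol:
            a = a_next
            break
        a = a_next

    w = X.T @ (a * y)   # normal vector of the separating hyperplane
    return w

# Toy usage on hypothetical linearly separable data: classify via sign(X @ w)
X = np.array([[1.0, 2.0], [2.0, 3.0], [-1.0, -1.5], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = train_vilsvm(X, y)
print(np.sign(X @ w))   # expected: [ 1.  1. -1. -1.]
```

The extragradient (predictor-corrector) iteration is used here instead of a plain projected-gradient step because the dual operator F(a) = Qa - 1 is merely monotone, not strongly monotone, and the extragradient method is the standard projection scheme that still converges in that setting.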