The nature of statistical learning theory
An Incremental Feature Learning Algorithm Based on Least Square Support Vector Machine
FAW '08 Proceedings of the 2nd annual international workshop on Frontiers in Algorithmics
An online incremental learning support vector machine for large-scale data
ICANN'10 Proceedings of the 20th international conference on Artificial neural networks: Part II
Incremental training of support vector machines
IEEE Transactions on Neural Networks
In this paper we propose a novel approach to incremental support vector machine training. The original SVM problem is a quadratic programming (QP) problem whose solution reduces to a linear combination of training examples. This observation suggests that an SVM can be viewed as a two-layer neural network: the structure of the first layer is determined by the chosen kernel function and the training examples, while the coefficients and bias of the second layer remain mutable. In our method we train the support vector weights and the bias with the same stochastic gradient descent algorithm used for perceptron training. In contrast with perceptron training, we choose the hinge loss rather than the squared error as the objective function, since under the hinge loss correctly classified training examples have no effect on the decision surface.
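The two-layer view described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the RBF kernel choice, and the learning-rate and epoch settings are all assumptions made for the example. The "first layer" is the kernel evaluation against the training examples; only the coefficients `alpha` and bias `b` of the "second layer" are updated, and an update fires only when the hinge loss is active (margin below 1), so correctly classified examples outside the margin leave the decision surface untouched.

```python
import numpy as np

def rbf_kernel(x, z, gamma=0.5):
    """Gaussian RBF kernel; the kernel choice fixes the 'first layer'."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

def train_incremental_svm(X, y, lr=0.1, epochs=50, gamma=0.5):
    """SGD on the hinge loss over the coefficients alpha and the bias b.

    Decision function: f(x) = sum_j alpha_j * K(x_j, x) + b.
    Only alpha and b (the 'second layer') are trained.
    """
    n = len(X)
    # Precompute the Gram matrix over the training examples.
    K = np.array([[rbf_kernel(X[i], X[j], gamma) for j in range(n)]
                  for i in range(n)])
    alpha = np.zeros(n)
    b = 0.0
    for _ in range(epochs):
        for i in range(n):
            margin = y[i] * (alpha @ K[:, i] + b)
            if margin < 1:  # hinge loss active: wrong side or inside margin
                alpha[i] += lr * y[i]
                b += lr * y[i]
            # margin >= 1: correctly classified, no update (hinge loss is zero)
    return alpha, b

def predict(x, X, alpha, b, gamma=0.5):
    """Classify x with the learned linear combination of kernel evaluations."""
    score = sum(a * rbf_kernel(xj, x, gamma) for a, xj in zip(alpha, X)) + b
    return np.sign(score)
```

Because each pass touches one example at a time, new training examples can be appended (extending `alpha` with a zero coefficient) and the same update rule continued, which is what makes the scheme incremental.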