To make the support vector machine applicable to time-varying problems, we introduce a forgetting factor into its cost function, in the same way that the recursive least-squares (RLS) algorithm does for adaptive filters. Although the idea of the forgetting factor is simple, we show that it drastically changes the behavior of the SVM by deriving the average generalization error in a simple case where the input space is one-dimensional. Unlike the batch and online SVMs, the average generalization error does not converge to zero. Computer simulations confirm these results.
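To make the construction concrete, here is a plausible form of the modified cost function, assuming the standard soft-margin SVM objective; the paper's exact formulation and notation may differ. At time t, with forgetting factor 0 < \lambda \le 1 and regularization constant C,

\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \;\; \frac{1}{2}\,\|\mathbf{w}\|^{2} \;+\; C \sum_{i=1}^{t} \lambda^{\,t-i}\, \xi_{i}
\quad \text{subject to} \quad y_{i}\,\bigl(\mathbf{w}^{\top}\mathbf{x}_{i} + b\bigr) \;\ge\; 1 - \xi_{i}, \qquad \xi_{i} \ge 0 .

Setting \lambda = 1 recovers the usual SVM, while \lambda < 1 geometrically down-weights the slack penalties of older samples, exactly as the RLS cost \sum_{i=1}^{t} \lambda^{\,t-i} e_{i}^{2} exponentially down-weights past prediction errors in adaptive filtering.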