A support vector machine with forgetting factor and its statistical properties
ICONIP'08: Proceedings of the 15th International Conference on Advances in Neuro-Information Processing, Part I
Introducing a forgetting factor allows a support vector machine to adapt to time-varying problems. However, the exponential forgetting factor proposed in earlier work does not guarantee convergence of the average generalization error, even for a simple linearly separable problem. To ensure convergence, we propose a factorial forgetting factor, whose weights decay factorially over time. Using a simple one-dimensional problem, we derive approximate expressions for the average generalization error under both the factorial and the exponential forgetting factors, and we confirm the theory by computer simulations. Finally, we show that the theory extends to arbitrary types of forgetting factors in simple linearly separable cases.
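As a rough illustration of the two weighting schemes, the sketch below compares an exponential schedule, where sample i at time t receives weight gamma**(t - i), with a factorial-decay schedule. The exact functional form of the paper's factorial factor is not given in the abstract; the form gamma**(t - i) / (t - i)! used here is only an assumption chosen to decay factorially. In a weighted SVM these weights would multiply each sample's slack (hinge-loss) term in the objective.

```python
import math

def exponential_weights(t, gamma=0.9):
    """Weight of sample i (i = t is the newest) under an exponential
    forgetting factor: w_i = gamma**(t - i)."""
    return [gamma ** (t - i) for i in range(t + 1)]

def factorial_weights(t, gamma=0.9):
    """A plausible factorial-decay schedule (illustrative assumption,
    not the paper's exact definition): w_i = gamma**(t - i) / (t - i)!
    Old samples are suppressed super-exponentially."""
    return [gamma ** (t - i) / math.factorial(t - i) for i in range(t + 1)]

t = 10
exp_w = exponential_weights(t)
fac_w = factorial_weights(t)
# The factorial schedule forgets old samples far faster than the
# exponential one, which is the property the convergence argument
# in the abstract relies on.
print(exp_w[0], fac_w[0])  # weight of the oldest sample under each scheme
```

The newest sample gets weight 1 under both schedules; the schemes differ only in how quickly the weight of stale samples vanishes.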