The nature of statistical learning theory
Scale-sensitive dimensions, uniform convergence, and learnability
Journal of the ACM (JACM)
A Theory of Learning and Generalization: With Applications to Neural Networks and Control Systems
The bounds on the rate of uniform convergence for learning machine
ISNN'05 Proceedings of the Second international conference on Advances in Neural Networks - Volume Part I
Minimum complexity regression estimation with weakly dependent observations
IEEE Transactions on Information Theory
Learning from uniformly ergodic Markov chains
Journal of Complexity
Semi-supervised learning based on high density region estimation
Neural Networks
Generalization bounds of ERM algorithm with V-geometrically Ergodic Markov chains
Advances in Computational Mathematics
Approximation and estimation bounds for free knot splines
Computers & Mathematics with Applications
Generalization performance is a central concern of theoretical research in machine learning. Vapnik, and later Cucker and Smale, showed that for learning machines trained on an i.i.d. sequence, the empirical risks converge uniformly to the corresponding expected risks as the number of samples approaches infinity. To study the generalization performance of learning machines with dependent input sequences, this paper extends these results to the case where the i.i.d. sequence is replaced by an exponentially strongly mixing sequence. Using Bernstein's inequality for exponentially strongly mixing sequences, we obtain a bound on the rate of uniform convergence for learning machines, and we also establish a bound on the rate of relative uniform convergence for learning machines based on an exponentially strongly mixing sequence. Finally, we compare these bounds with previous results.
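For context, the following is a minimal sketch (not the paper's exact statement) of the type of uniform convergence bound the abstract refers to; the hypothesis space $\mathcal{H}$, the covering number $\mathcal{N}$, the constant $C$, and the effective sample size $n_e$ are assumed notation used here only for illustration:

\[
\Pr\Bigl\{\sup_{f\in\mathcal{H}} \bigl|\mathcal{E}(f)-\mathcal{E}_n(f)\bigr| \ge \varepsilon \Bigr\}
\;\le\; 2\,\mathcal{N}\!\bigl(\mathcal{H},\tfrac{\varepsilon}{8}\bigr)\,
\exp\!\Bigl(-\frac{n_e\,\varepsilon^{2}}{C}\Bigr),
\]

where $\mathcal{E}(f)=\mathbb{E}_{z}[\ell(f,z)]$ is the expected risk, $\mathcal{E}_n(f)=\frac{1}{n}\sum_{i=1}^{n}\ell(f,z_i)$ is the empirical risk over a sample $z_1,\dots,z_n$ drawn from a stationary, exponentially strongly mixing sequence (mixing coefficients satisfying $\alpha(k)\le\bar{\alpha}\exp(-ck^{\gamma})$ for some $c,\gamma>0$), and $n_e$ is an effective number of observations that grows more slowly than $n$ and reduces to $n$ in the i.i.d. case. Bernstein's inequality for such mixing sequences supplies the exponential term; the paper's theorems give the precise constants, exponents, and the relative (normalized) version of this bound.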