We consider the problem of one-step-ahead prediction for time series generated by an underlying stationary stochastic process obeying the condition of absolute regularity, which describes the mixing nature of the process. We make use of recent results from the theory of empirical processes, and adapt the uniform convergence framework of Vapnik and Chervonenkis to the problem of time series prediction, obtaining finite-sample bounds. Furthermore, by allowing both the model complexity and the memory size to be adaptively determined by the data, we derive nonparametric rates of convergence through an extension of the method of structural risk minimization suggested by Vapnik. All our results are derived for general L_p error measures, and apply to both exponentially and algebraically mixing processes.
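To make the adaptive selection step concrete, below is a minimal sketch of structural-risk-minimization-style model selection for one-step-ahead prediction. It is an illustration under stated assumptions, not the paper's construction: the linear lag predictors, the function srm_select, and the penalty c * sqrt(d * log(n) / n) are illustrative stand-ins for the bound-derived complexity term; the paper's estimators, penalties, and rates differ.

import numpy as np

def make_regression_data(x, d):
    """Turn a scalar series into (lagged-window, next-value) pairs with memory size d."""
    X = np.stack([x[i:i + d] for i in range(len(x) - d)])
    y = x[d:]
    return X, y

def fit_linear(X, y):
    """Least-squares fit of an affine predictor on the lagged windows."""
    A = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def empirical_risk(w, X, y, p=2):
    """Empirical L_p prediction error of the fitted predictor."""
    pred = np.hstack([X, np.ones((len(X), 1))]) @ w
    return np.mean(np.abs(pred - y) ** p)

def srm_select(x, max_d=10, p=2, c=1.0):
    """Pick the memory size d minimizing empirical risk plus a complexity penalty.
    The penalty sqrt(d * log(n) / n) is an assumed stand-in for the
    finite-sample bound that drives the actual SRM selection rule."""
    best = None
    for d in range(1, max_d + 1):
        X, y = make_regression_data(x, d)
        n = len(y)
        w = fit_linear(X, y)
        score = empirical_risk(w, X, y, p) + c * np.sqrt(d * np.log(n) / n)
        if best is None or score < best[0]:
            best = (score, d, w)
    return best

# Example: an AR(2) process, where the data-driven rule should favor a small d.
rng = np.random.default_rng(0)
x = np.zeros(2000)
for t in range(2, len(x)):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + rng.normal(scale=0.1)
score, d, w = srm_select(x)
print(f"selected memory size d={d}, penalized risk={score:.4f}")

On such a sample the penalized score typically stops improving once the memory size covers the true lag order, which is the qualitative behavior the data-driven choice of memory size and model complexity is meant to capture.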