The minimum complexity regression estimation framework (Barron, 1991; Barron and Cover, 1991; Rissanen, 1989) is a general data-driven methodology for estimating a regression function from a given list of parametric models using independent and identically distributed (i.i.d.) observations. We extend Barron's regression estimation framework to m-dependent observations and to strongly mixing observations. In particular, we propose abstract minimum complexity regression estimators for dependent observations, which may be adapted to a particular list of parametric models, and we establish upper bounds on the statistical risks of the proposed estimators in terms of certain deterministic indices of resolvability. Assuming that the regression function admits a certain Fourier-transform-type representation, we examine minimum complexity regression estimators adapted to a list of parametric models based on neural networks; using the upper bounds for the abstract estimators, we establish rates of convergence for the statistical risks of these estimators. Finally, as a key tool, we extend the classical Bernstein inequality from i.i.d. random variables to m-dependent processes and to strongly mixing processes.
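For reference, the classical i.i.d. Bernstein inequality that the abstract takes as its starting point can be stated as follows (this is the standard textbook form, not a result reproduced from the paper): if $X_1, \ldots, X_n$ are i.i.d. with $\mathbb{E}[X_i] = 0$, $|X_i| \le M$ almost surely, and $\sigma^2 = \mathbb{E}[X_i^2]$, then for every $t > 0$,

$$
\mathbb{P}\!\left( \left| \sum_{i=1}^{n} X_i \right| > t \right) \;\le\; 2 \exp\!\left( - \frac{t^2}{2\left( n\sigma^2 + M t / 3 \right)} \right).
$$

The extensions described in the abstract replace the independence assumption with m-dependence or strong mixing, at the cost of modified constants and effective sample sizes.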
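To illustrate the general idea of minimum complexity (complexity-penalized) model selection, here is a minimal sketch in Python on synthetic i.i.d. data. The data-generating function, the polynomial model list, and the penalty constant are all illustrative assumptions, not the neural-network model list or the penalties analyzed in the paper; the point is only the selection rule: minimize empirical risk plus a complexity penalty over a list of parametric models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data (illustrative only).
n = 200
x = rng.uniform(-1.0, 1.0, n)
y = np.sin(2.0 * x) + 0.1 * rng.standard_normal(n)

def fit_poly(x, y, d):
    """Least-squares fit of a degree-d polynomial model; returns coefficients."""
    return np.polyfit(x, y, d)

def penalized_risk(x, y, d, c=1.0):
    """Empirical squared error plus a complexity penalty of order (d+1) log(n) / n.

    The penalty form mimics a minimum-complexity / MDL-style criterion;
    the constant c is a hypothetical choice, not one from the paper.
    """
    coef = fit_poly(x, y, d)
    resid = y - np.polyval(coef, x)
    emp_risk = np.mean(resid ** 2)
    penalty = c * (d + 1) * np.log(len(y)) / len(y)
    return emp_risk + penalty

# Select the model (degree) minimizing the penalized empirical risk.
degrees = range(10)
best_d = min(degrees, key=lambda d: penalized_risk(x, y, d))
print("selected degree:", best_d)
```

The selected degree trades off fit against model size: richer models lower the empirical risk but pay a larger penalty, which is exactly the balance captured by the index of resolvability in the abstract's risk bounds.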