Let {X_t} be a real-valued time series. The best nonlinear predictor of X_0 given the infinite past X_{-∞}^{-1}, in the least-squares sense, is the conditional mean E{X_0 | X_{-∞}^{-1}}. Previously, it has been shown that certain predictors based on growing segments of past observations converge to the best predictor given the infinite past whenever {X_t} is a stationary process taking values in a bounded interval. The present paper deals with universal prediction schemes for stationary processes with finite mean. We also discuss universal schemes for learning the conditional mean E{X_0 | X_{-∞}^{-1}, Y_{-∞}^{0}} from past observations of a stationary pair process {(X_t, Y_t)}, and schemes for learning the regression function m(y) = E{X | Y = y} from independent samples of (X, Y).
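The paper's own schemes are not reproduced here; as a minimal illustrative sketch of the last problem — estimating the regression function m(y) = E{X | Y = y} from independent samples of (X, Y) — one can use a k-nearest-neighbor average, a standard estimator that is universally consistent under mild conditions. The function name, the toy data, and the choice k = 50 are assumptions for illustration, not the paper's construction.

```python
import numpy as np

def knn_regression(y_samples, x_samples, y, k):
    """k-NN estimate of m(y) = E[X | Y = y]: average the X-values
    whose paired Y-values are the k closest to the query point y.
    (Illustrative sketch; not the scheme analyzed in the paper.)"""
    distances = np.abs(y_samples - y)        # distances in Y-space
    nearest = np.argsort(distances)[:k]      # indices of k nearest neighbors
    return x_samples[nearest].mean()         # local average of X

# Toy example where the true regression function is m(y) = y^2.
rng = np.random.default_rng(0)
Y = rng.uniform(-1.0, 1.0, 2000)             # i.i.d. samples of Y
X = Y**2 + rng.normal(0.0, 0.05, 2000)       # X = m(Y) + small noise
est = knn_regression(Y, X, y=0.5, k=50)      # should be close to 0.25
```

As the sample size grows and k grows suitably slowly relative to it, such local averages converge to the conditional mean; the universal schemes discussed in the paper address the harder dependent-data settings.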