The central problem in machine learning (and statistics) is that of predicting a future event x_{n+1} from past observations x_1 x_2 ... x_n, where n = 1, 2, .... The main goal is to find a prediction method that minimizes the total loss suffered on a sequence x_1 x_2 ... x_{n+1} for n = 1, 2, .... We say that a data sequence is stochastic if there exists a simply described prediction algorithm whose performance on it is close to the best possible. This optimal performance is defined in terms of Vovk's predictive complexity, a generalization of the notion of Kolmogorov complexity. Predictive complexity gives a limit on the predictive performance of simply described prediction algorithms. In this paper we argue that data sequences normally occurring in the real world are stochastic; more formally, we prove that Levin's a priori semimeasure of the set of nonstochastic sequences is small. Copyright 2001 Academic Press.
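To make the notion of "total loss suffered on a sequence" concrete, the following is a minimal sketch in Python. It assumes a binary alphabet, the logarithmic loss function, and Laplace's rule of succession as the "simply described" predictor; none of these specific choices are fixed by the abstract, they are illustrative assumptions only.

```python
import math

def laplace_predictor(past):
    # Laplace's rule of succession: predicted probability that the next
    # bit is 1, given the bits observed so far. This is an example of a
    # simply described prediction algorithm (illustrative assumption).
    return (past.count(1) + 1) / (len(past) + 2)

def total_log_loss(bits, predict):
    # Cumulative loss sum over n of loss(p_n, x_{n+1}), here under the
    # logarithmic loss: loss(p, x) = -log2(p) if x = 1, else -log2(1 - p).
    loss = 0.0
    for n in range(len(bits)):
        p = predict(bits[:n])
        loss += -math.log2(p if bits[n] == 1 else 1.0 - p)
    return loss

# A highly regular sequence is cheap to predict: its total loss grows
# slowly, whereas an irregular sequence accumulates loss much faster.
regular = [1] * 20
alternating = [1, 0] * 10
print(total_log_loss(regular, laplace_predictor))
print(total_log_loss(alternating, laplace_predictor))
```

Under log loss the total loss of the best predictor is, up to additive constants, the sequence's predictive complexity; a sequence is stochastic in the paper's sense when some simple predictor, like the one above, comes close to that optimum.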