Theory of recursive functions and effective computability
Introduction to algorithms
COLT '90 Proceedings of the third annual workshop on Computational learning theory
COLT '95 Proceedings of the eighth annual conference on Computational learning theory
Journal of the ACM (JACM)
An introduction to Kolmogorov complexity and its applications (2nd ed.)
COLT' 98 Proceedings of the eleventh annual conference on Computational learning theory
A game of prediction with expert advice
Journal of Computer and System Sciences - Special issue on the eighth annual workshop on computational learning theory, July 5–8, 1995
Tight worst-case loss bounds for predicting with expert advice
EuroCOLT '95 Proceedings of the Second European Conference on Computational Learning Theory
General Linear Relations among Different Types of Predictive Complexity
ALT '99 Proceedings of the 10th International Conference on Algorithmic Learning Theory
Non-linear Inequalities between Predictive and Kolmogorov Complexities
ALT '01 Proceedings of the 12th International Conference on Algorithmic Learning Theory
Predictive complexity and information
Journal of Computer and System Sciences - Special issue on COLT 2002
Predictive complexity and generalized entropy rate of stationary ergodic processes
ALT'12 Proceedings of the 23rd international conference on Algorithmic Learning Theory
A new notion of predictive complexity and a corresponding amount of information are considered. Predictive complexity is a generalization of Kolmogorov complexity that bounds the ability of any algorithm to predict elements of a sequence of outcomes. We consider predictive complexity for a wide class of bounded loss functions that generalize the square-loss function. Relations between the unconditional predictive complexity KG(x) and the conditional predictive complexity KG(x|y) are studied. We define an algorithm with an "expanding property": with positive probability, it transforms sequences of a given predictive complexity into sequences of essentially greater predictive complexity. The notion of the amount of predictive information IG(y : x) is also studied. We show that this information is non-commutative in a very strong sense and present asymptotic relations between the values IG(y : x), IG(x : y), KG(x) and KG(y).
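As a rough illustration of the loss-based quantity the abstract refers to, the sketch below computes the cumulative square loss of a simple on-line prediction strategy on a binary sequence. Predictive complexity for the square-loss game lower-bounds, up to an additive constant, the cumulative loss achievable by any computable strategy; the Laplace-rule predictor and all function names here are illustrative assumptions, not constructions from the paper.

```python
def cumulative_square_loss(outcomes, predict):
    """Total square loss sum_i (x_i - p_i)^2 of a prediction strategy
    on a binary sequence, where each prediction p_i in [0, 1] is
    computed from the prefix of outcomes seen so far."""
    total = 0.0
    for i, x in enumerate(outcomes):
        p = predict(outcomes[:i])  # prediction before seeing outcome x
        total += (x - p) ** 2
    return total

def laplace(prefix):
    # Laplace rule: smoothed frequency of 1s in the prefix,
    # used as the predicted probability of the next bit being 1.
    return (sum(prefix) + 1) / (len(prefix) + 2)

# A highly regular sequence accumulates far less loss than an
# alternating one, mirroring how predictive complexity separates
# "easy" from "hard" sequences.
constant_loss = cumulative_square_loss([1] * 20, laplace)
alternating_loss = cumulative_square_loss([0, 1] * 10, laplace)
```

Any other computable strategy could be plugged in for `laplace`; predictive complexity is, informally, the best such cumulative loss achievable by a (super)prediction process, just as Kolmogorov complexity is the best achievable description length.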