Given a set X of sequences over a finite alphabet, we investigate the following three quantities.

(i) The feasible predictability of X is the highest success ratio that a polynomial-time randomized predictor can achieve on all sequences in X.

(ii) The deterministic feasible predictability of X is the highest success ratio that a polynomial-time deterministic predictor can achieve on all sequences in X.

(iii) The feasible dimension of X is the polynomial-time effectivization of the classical Hausdorff dimension ("fractal dimension") of X.

Predictability is known to be stable in the sense that the feasible predictability of X ∪ Y is always the minimum of the feasible predictabilities of X and Y. We show that deterministic predictability also has this property if X and Y are computably presentable, and that deterministic predictability coincides with randomized predictability on singleton sets. Our main theorem states that the feasible dimension of X is bounded above by the maximum entropy of the predictability of X and bounded below by the segmented self-information of the predictability of X, and that these bounds are tight.
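As a rough illustration of the upper bound in the binary case: for a predictor with success ratio p on sequences over {0, 1}, the maximum entropy of a distribution whose largest probability is p is the binary Shannon entropy H(p). The sketch below is an assumption-laden illustration, not the paper's construction; the function name `binary_entropy` and the restriction to a two-letter alphabet are ours.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (base 2) of the distribution {p, 1 - p}."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A predictor that is correct on every bit (predictability 1) forces
# dimension 0, while predictability 1/2 (no better than uniform
# guessing) is compatible with dimension as high as 1.
print(binary_entropy(1.0))  # 0.0
print(binary_entropy(0.5))  # 1.0
print(binary_entropy(0.9))  # ≈ 0.469
```

On larger alphabets the bound involves the full maximum-entropy function of the predictability rather than H(p), so this snippet only conveys the shape of the result.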