This paper studies sequence prediction based on the monotone Kolmogorov complexity Km = -log m, i.e. based on universal deterministic (one-part) MDL. The semimeasure m is extremely close to Solomonoff's universal prior M, the latter being an excellent predictor in deterministic as well as probabilistic environments, where performance is measured in terms of convergence of posteriors or losses. Despite this closeness to M, it is difficult to assess the prediction quality of m, since little is known about the closeness of their posteriors, which are the quantities relevant for prediction. We show that for deterministic computable environments, the "posterior" and the losses of m converge, but rapid convergence can be shown only on-sequence; off-sequence convergence can be slow. In probabilistic environments, neither the posterior nor the losses converge, in general.
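The prediction scheme discussed above uses posterior ratios of a prior over sequences: the predicted probability of the next bit b given prefix x is M(xb)/M(x). Since m and M are not computable, the following sketch substitutes a tiny Bayes mixture over two Bernoulli hypotheses as a computable stand-in prior (an assumption for illustration only, not the universal prior itself); it shows how on-sequence posteriors converge toward the true environment.

```python
# Toy illustration of posterior-based sequence prediction.
# The universal prior M (and the semimeasure m) are uncomputable,
# so this sketch uses a two-hypothesis Bayes mixture as a
# computable stand-in prior -- an assumption for illustration.

def mixture_prob(seq, thetas=(0.5, 0.9), weights=(0.5, 0.5)):
    """Prior probability of a binary sequence under the toy mixture."""
    total = 0.0
    for theta, w in zip(thetas, weights):
        p = w
        for b in seq:
            p *= theta if b == 1 else 1.0 - theta
        total += p
    return total

def posterior_next(seq):
    """Posterior probability that the next bit is 1: P(seq+[1]) / P(seq)."""
    return mixture_prob(seq + [1]) / mixture_prob(seq)

# On a sequence typical for Bernoulli(0.9), the on-sequence posterior
# for the next bit being 1 moves toward 0.9 as evidence accumulates.
seq = [1, 1, 1, 0, 1, 1, 1, 1, 1, 1]
print(round(posterior_next(seq), 3))
```

For deterministic environments the analogue is a prior concentrated on programs generating the sequence; the paper's point is that for the one-part MDL predictor based on Km this convergence is only guaranteed on-sequence, and can fail off-sequence or in probabilistic environments.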