Solomonoff's uncomputable universal prediction scheme ξ makes it possible to predict the next symbol x_k given the observed sequence x_1...x_{k-1} for any Turing-computable, but otherwise unknown, probabilistic environment µ. This scheme is generalized to arbitrary environmental classes, which, among other things, allows the construction of computable universal prediction schemes ξ. Convergence of ξ to µ in a conditional mean-squared sense and with µ-probability 1 is proven. It is shown that the average number of prediction errors made by the universal ξ scheme rapidly converges to the number made by the best possible informed µ scheme. The schemes, theorems, and proofs are given for a general finite alphabet, which introduces additional complications compared to the binary case. Several extensions of the presented theory and results are outlined, including general loss functions and bounds, games of chance, infinite alphabets, partial and delayed prediction, classification, and more active systems.
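As a minimal illustrative sketch (not the paper's construction), the Bayesian-mixture idea behind ξ can be shown for a small, hand-picked class of i.i.d. environments over a finite alphabet: the mixture weights are updated by Bayes' rule after each observation, and the error count of the mixture predictor is compared with that of the informed predictor that knows the true µ. All names (ENV_CLASS, xi_predict, etc.) are illustrative assumptions, and the true universal setting mixes over all enumerable semimeasures and is uncomputable.

```python
import random

ALPHABET = [0, 1, 2]  # a general finite alphabet, illustrated with 3 symbols

# Hypothetical environment class M: each entry maps a symbol to its probability.
ENV_CLASS = [
    {0: 0.8, 1: 0.1, 2: 0.1},
    {0: 0.1, 1: 0.8, 2: 0.1},
    {0: 1/3, 1: 1/3, 2: 1/3},
]

def xi_predict(weights):
    """Mixture prediction: xi(x) = sum over mu of w_mu * mu(x)."""
    return {x: sum(w * mu[x] for w, mu in zip(weights, ENV_CLASS)) for x in ALPHABET}

def update_weights(weights, observed):
    """Bayesian posterior update of the mixture weights after one observation."""
    new = [w * mu[observed] for w, mu in zip(weights, ENV_CLASS)]
    total = sum(new)
    return [w / total for w in new]

def run(true_env, steps=200, seed=0):
    rng = random.Random(seed)
    weights = [1.0 / len(ENV_CLASS)] * len(ENV_CLASS)  # uniform prior over the class
    xi_errors = mu_errors = 0
    for _ in range(steps):
        # Draw the next symbol from the true (unknown to xi) environment.
        x = rng.choices(ALPHABET, weights=[true_env[a] for a in ALPHABET])[0]
        # xi predicts the most probable symbol under the mixture;
        # the informed mu-scheme predicts with full knowledge of true_env.
        pred = xi_predict(weights)
        xi_guess = max(pred, key=pred.get)
        mu_guess = max(true_env, key=true_env.get)
        xi_errors += (xi_guess != x)
        mu_errors += (mu_guess != x)
        weights = update_weights(weights, x)
    return xi_errors, mu_errors

if __name__ == "__main__":
    # The cumulative error counts of xi and the informed scheme stay close,
    # mirroring (in toy form) the error-bound results described above.
    print(run(ENV_CLASS[0]))
```

In this toy setting the posterior weight concentrates on the true environment, so the mixture's predictions quickly agree with the informed predictor's; the paper's results establish the corresponding convergence and error bounds in the general, non-i.i.d. setting.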