An introduction to Kolmogorov complexity and its applications (2nd ed.)
Kolmogorov entropy in the context of computability theory
Theoretical Computer Science
Algorithmic Theories of Everything
Convergence and Loss Bounds for Bayesian Sequence Prediction
Universal Artificial Intelligence: Sequential Decisions Based On Algorithmic Probability
Minimum description length induction, Bayesianism, and Kolmogorov complexity
IEEE Transactions on Information Theory
Complexity-based induction systems: Comparisons and convergence theorems
IEEE Transactions on Information Theory
On the ideal convergence of sequences of fuzzy numbers
Information Sciences: An International Journal
Universal prediction of selected bits
ALT'11: Proceedings of the 22nd International Conference on Algorithmic Learning Theory
Solomonoff's central result on induction is that the prediction of a universal semimeasure M converges rapidly, with probability 1, to the true sequence-generating distribution μ, provided μ is computable. Hence M is a suitable universal sequence predictor when μ is unknown. Despite related results and proofs in the literature, the stronger question of convergence on all (Martin-Löf) random sequences remained open. Such a convergence result would be particularly interesting and natural, since Martin-Löf randomness can be defined in terms of M itself. We show that there are universal semimeasures M that do not converge to μ on all μ-random sequences, i.e. we give a partial negative answer to this open problem. We also provide a positive answer for some non-universal semimeasures. We define the incomputable measure D as a mixture over all computable measures, and the enumerable semimeasure W as a mixture over all enumerable nearly-measures. We show that W converges to D, and D to μ, on all μ-random sequences. The Hellinger distance, which measures the closeness of two distributions, plays a central role in the proofs.
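The convergence phenomenon described in the abstract can be illustrated with a much simpler, finite analogue. The sketch below (not the paper's construction) replaces the countable class of computable measures with a handful of Bernoulli distributions, forms the Bayesian mixture predictor over them, and tracks the squared Hellinger distance between the mixture's next-bit prediction and the true distribution; the class, the parameter values, and the true parameter are all illustrative assumptions.

```python
import math
import random

random.seed(0)

# Finite stand-in for the countable class of computable measures:
# a few candidate Bernoulli parameters. TRUE_THETA plays the role of
# the unknown true distribution mu, assumed to lie in the class.
CLASS_THETAS = [0.1, 0.3, 0.5, 0.7, 0.9]
TRUE_THETA = 0.7

def hellinger2(p, q):
    """Squared Hellinger distance between Bernoulli(p) and Bernoulli(q)."""
    return ((math.sqrt(p) - math.sqrt(q)) ** 2
            + (math.sqrt(1 - p) - math.sqrt(1 - q)) ** 2)

def mixture_predictor(n_steps=2000):
    """Run the Bayesian mixture predictor on a mu-sampled sequence.

    Returns the squared Hellinger distance between the mixture's
    next-bit prediction and the true distribution at each step.
    """
    weights = [1.0 / len(CLASS_THETAS)] * len(CLASS_THETAS)  # uniform prior
    dists = []
    for _ in range(n_steps):
        # Mixture probability that the next bit is 1
        m_pred = sum(w * th for w, th in zip(weights, CLASS_THETAS))
        dists.append(hellinger2(m_pred, TRUE_THETA))
        # Sample the next bit from the true distribution mu
        bit = 1 if random.random() < TRUE_THETA else 0
        # Bayesian posterior update of the mixture weights
        weights = [w * (th if bit else 1.0 - th)
                   for w, th in zip(weights, CLASS_THETAS)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return dists

dists = mixture_predictor()
print("initial distance:", dists[0])
print("final distance:  ", dists[-1])
```

The posterior weight on wrong candidates decays exponentially in the sequence length, so the mixture prediction approaches the true parameter and the Hellinger distance shrinks toward zero; the paper's negative result is precisely that this tidy picture can fail for universal semimeasures on individual μ-random sequences.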