Complexity-based induction systems: Comparisons and convergence theorems
IEEE Transactions on Information Theory
In order to generate a universal probability distribution to extrapolate a binary string x of length i, we feed random bits into a universal device, M. When we find an input string that gives an output matching x, we continue the successful input with random bits until M produces a zero or one as output. The relative probabilities of these two continuations give a normalized prediction for the probability of the symbol following x. There is, however, a probability P_{i+1}(u) that the continued random input string will not generate any output for the (i+1)th symbol. We will show E_μ Σ_{i=1}^n P_i(u) =
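The sampling procedure described above can be sketched in code. This is a toy Monte Carlo illustration only, not the paper's construction: a true universal device M is uncomputable to enumerate, so the `toy_machine` interpreter below is an invented stand-in, and in it a halt instruction plays the role of the failure event u (no (i+1)th output symbol). The names `toy_machine` and `predict_next` are assumptions for this sketch.

```python
import random

def toy_machine(program, max_out):
    # Invented stand-in for the universal device M: reads the program
    # two bits at a time and emits output symbols.
    #   00 -> emit 0, 01 -> emit 1,
    #   10 -> repeat the last emitted bit (0 if none yet),
    #   11 -> halt (models failure to produce further output).
    out = []
    for i in range(0, len(program) - 1, 2):
        op = program[i:i + 2]
        if op == "00":
            out.append("0")
        elif op == "01":
            out.append("1")
        elif op == "10":
            out.append(out[-1] if out else "0")
        else:  # "11": halt
            break
        if len(out) >= max_out:
            break
    return "".join(out)

def predict_next(x, trials=20000, prog_len=40, seed=0):
    # Feed random bit strings into the machine; keep the ones whose
    # output matches x, and tally how the successful inputs continue:
    # with a 0, with a 1, or with no (i+1)th symbol at all ("u").
    rng = random.Random(seed)
    counts = {"0": 0, "1": 0, "u": 0}
    for _ in range(trials):
        program = "".join(rng.choice("01") for _ in range(prog_len))
        out = toy_machine(program, max_out=len(x) + 1)
        if len(out) >= len(x) and out[: len(x)] == x:
            if len(out) > len(x):
                counts[out[len(x)]] += 1
            else:
                counts["u"] += 1  # the P_{i+1}(u) event
    return counts

counts = predict_next("01")
# Normalized prediction over the two defined continuations:
p1 = counts["1"] / (counts["0"] + counts["1"])
```

The ratio `counts["u"] / sum(counts.values())` then estimates P_{i+1}(u) for this toy machine, while `p1` is the normalized prediction for the next symbol.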