Convergence and Error Bounds for Universal Prediction of Nonbinary Sequences
EMCL '01 Proceedings of the 12th European Conference on Machine Learning
The Speed Prior: A New Simplicity Measure Yielding Near-Optimal Computable Predictions
COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
Optimal Ordered Problem Solver
Machine Learning
A coding theorem for enumerable output machines
Information Processing Letters
Algorithmic complexity bounds on future prediction errors
Information and Computation
On generalized computable universal priors and their convergence
Theoretical Computer Science - Algorithmic Learning Theory
On semimeasures predicting Martin-Löf random sequences
Theoretical Computer Science
Anticipatory Behavior in Adaptive Learning Systems
The computational status of physics
Natural Computing: an international journal
Sequential predictions based on algorithmic complexity
Journal of Computer and System Sciences
Prefix-Like complexities and computability in the limit
CiE'06 Proceedings of the Second Conference on Computability in Europe: Logical Approaches to Computational Barriers
Monotone conditional complexity bounds on future prediction errors
ALT'05 Proceedings of the 16th International Conference on Algorithmic Learning Theory
We make the plausible assumption that the history of our universe is formally describable, and sampled from a formally describable probability distribution on the possible universe histories. To study the dramatic consequences for observers evolving within such a universe, we generalize the concepts of decidability, halting problem, Kolmogorov's algorithmic complexity, and Solomonoff's algorithmic probability. We describe objects more random than Chaitin's halting probability of a Turing machine, show that there is a universal cumulatively enumerable measure (CEM) that dominates previous measures for inductive inference, prove that any CEM must assign low probabilities to universes without short enumerating programs, that any describable measure must assign low probabilities to universes without short descriptions, and several similar "Occam's razor theorems." Then we discuss the most efficient way of computing all universes based on Levin's optimal search algorithm, and make a natural resource-oriented postulate: the cumulative prior probability of all objects incomputable within time t by this optimal algorithm should be inversely proportional to t. We derive consequences for inductive inference, physics, and philosophy, predicting that whatever seems random is not, but in fact is computed by a short and fast algorithm which will probably halt before our universe is many times older than it is now.
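The phase-based time allocation behind Levin's optimal search, which the abstract's resource postulate builds on, can be illustrated with a toy sketch. This is not the paper's construction: `interpret` below is a hypothetical stand-in machine (a bit string "computes" its integer value at a pretended cost of that many steps), chosen only to make the scheduling visible. The key idea it does reproduce is that in phase k every program p of length l(p) ≤ k receives 2^(k − l(p)) steps, so a program of length l needing t steps is found after total time on the order of 2^l · t.

```python
from itertools import product

def interpret(p, budget):
    # Hypothetical toy machine: program p (a bit string) outputs int(p, 2),
    # but we pretend the computation costs int(p, 2) time steps.  It only
    # succeeds if the granted budget covers that cost.
    value = int(p, 2)
    return value if budget >= value else None

def levin_search(is_solution, interpret, max_phase=20):
    # Levin-style universal search (toy version): in phase k, each binary
    # program p of length l <= k runs for 2**(k - l) steps, so p is always
    # granted roughly a 2**(-l) fraction of the total time spent so far.
    for k in range(1, max_phase + 1):
        for l in range(1, k + 1):
            budget = 2 ** (k - l)
            for bits in product("01", repeat=l):
                p = "".join(bits)
                out = interpret(p, budget)
                if out is not None and is_solution(out):
                    return p, k  # program found, and the phase that found it
    return None  # nothing found within max_phase phases
```

Under these toy assumptions, searching for the output 5 returns the shortest program "101" (length 3, pretended cost 5 steps) in phase 6, the first phase where 2^(6−3) = 8 ≥ 5. Objects whose fastest short program needs time t thus surface at a phase, and hence a cumulative time budget, growing linearly in t, which is the intuition behind assigning them prior probability inversely proportional to t.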