An introduction to Kolmogorov complexity and its applications (2nd ed.)
On the Length of Programs for Computing Finite Binary Sequences: statistical considerations
Journal of the ACM (JACM)
Convergence and Error Bounds for Universal Prediction of Nonbinary Sequences
EMCL '01 Proceedings of the 12th European Conference on Machine Learning
General Loss Bounds for Universal Sequence Prediction
ICML '01 Proceedings of the Eighteenth International Conference on Machine Learning
A Computer Scientist's View of Life, the Universe, and Everything
Foundations of Computer Science: Potential - Theory - Cognition, to Wilfried Brauer on the occasion of his sixtieth birthday
Algorithmic Theories of Everything
The Fastest and Shortest Algorithm for All Well-Defined Problems
Self-Optimizing and Pareto-Optimal Policies in General Environments Based on Bayes-Mixtures
COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
Optimality of universal Bayesian sequence prediction for general loss and alphabet
The Journal of Machine Learning Research
Optimal Ordered Problem Solver
Machine Learning
Algorithmic complexity bounds on future prediction errors
Information and Computation
Universal Intelligence: A Definition of Machine Intelligence
Minds and Machines
A computational approximation to the AIXI model
AGI'08 Proceedings of the First AGI Conference
Adversarial Sequence Prediction
AGI'08 Proceedings of the First AGI Conference
Anticipatory Behavior in Adaptive Learning Systems
Cross-domain knowledge transfer using structured representations
AAAI'06 Proceedings of the 21st national conference on Artificial intelligence - Volume 1
Journal of Artificial Intelligence Research
A universal measure of intelligence for artificial agents
IJCAI'05 Proceedings of the 19th international joint conference on Artificial intelligence
Sequential predictions based on algorithmic complexity
Journal of Computer and System Sciences
DS'07 Proceedings of the 10th international conference on Discovery science
2006: celebrating 75 years of AI - history and outlook: the next 25 years
50 years of artificial intelligence
Measuring universal intelligence: Towards an anytime intelligence test
Artificial Intelligence
Optimality issues of universal greedy agents with static priors
ALT'10 Proceedings of the 21st international conference on Algorithmic learning theory
Completely self-referential optimal reinforcement learners
ICANN'05 Proceedings of the 15th international conference on Artificial neural networks: formal models and their applications - Volume Part II
A Monte-Carlo AIXI approximation
Journal of Artificial Intelligence Research
Compression and learning in linear regression
ISMIS'11 Proceedings of the 19th international conference on Foundations of intelligent systems
On more realistic environment distributions for defining, evaluating and developing intelligence
AGI'11 Proceedings of the 4th international conference on Artificial general intelligence
Measuring agent intelligence via hierarchies of environments
AGI'11 Proceedings of the 4th international conference on Artificial general intelligence
Real-world limits to algorithmic intelligence
AGI'11 Proceedings of the 4th international conference on Artificial general intelligence
Monotone conditional complexity bounds on future prediction errors
ALT'05 Proceedings of the 16th international conference on Algorithmic Learning Theory
An architecture for self-organising evolvable virtual machines
Engineering Self-Organising Systems
Asymptotic non-learnability of universal agents with computable horizon functions
Theoretical Computer Science
Solomonoff's optimal but noncomputable method for inductive inference assumes that observation sequences x are drawn from a recursive prior distribution µ(x). Instead of using the unknown µ(x), he predicts using the celebrated universal enumerable prior M(x), which for all x exceeds any recursive µ(x), save for a constant factor independent of x. The simplicity measure M(x) naturally implements "Occam's razor" and is closely related to the Kolmogorov complexity of x. However, M assigns high probability to certain data x that are extremely hard to compute. This does not match our intuitive notion of simplicity. Here we suggest a more plausible measure derived from the fastest way of computing data. In the absence of contrary evidence, we assume that the physical world is generated by a computational process, and that any possibly infinite sequence of observations is therefore computable in the limit (this assumption is more radical and stronger than Solomonoff's). We then replace M by the novel Speed Prior S, under which the cumulative a priori probability of all data whose computation through the optimal algorithm requires more than O(n) resources is 1/n. We show that the Speed Prior allows us to derive a computable strategy for optimal prediction of future y, given past x. We then consider the case that the data actually stem from a nonoptimal, unknown computational process, and use Hutter's recent results to derive excellent expected loss bounds for S-based inductive inference. We conclude with several nontraditional predictions concerning the future of our universe.
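The intuition behind S can be sketched in code. The sketch below is a toy finite approximation, not Schmidhuber's exact definition: `toy_run` is a hypothetical machine whose "programs" are bit strings emitted cyclically at one bit per step, and `speed_prior_approx` mimics the phase structure of his FAST algorithm, in which phase i grants each program p a budget of 2^(i − l(p)) steps and contributes weight 2^(−i)·2^(−l(p)) for every p whose output begins with x. The bounds `max_phase` and `max_len` are illustrative truncations. The effect to observe is that strings computable quickly by short programs (e.g. "0000") receive more weight than strings of the same length that need longer programs (e.g. "0110").

```python
from itertools import product

def toy_run(program, n_bits):
    # Toy machine (an assumption for illustration): the program's bits
    # are emitted cyclically, one bit per step, so producing n_bits of
    # output costs exactly n_bits steps.
    return "".join(program[t % len(program)] for t in range(n_bits))

def speed_prior_approx(x, max_phase=8, max_len=6):
    # Crude finite approximation in the spirit of the Speed Prior:
    # phase i gives program p a budget of 2**(i - len(p)) steps; every
    # program whose output within that budget starts with x contributes
    # weight 2**(-i) * 2**(-len(p)).
    total = 0.0
    for i in range(1, max_phase + 1):
        for l in range(1, min(i, max_len) + 1):
            budget = 2 ** (i - l)
            if budget < len(x):
                continue  # p cannot have printed all of x in this phase
            for bits in product("01", repeat=l):
                p = "".join(bits)
                if toy_run(p, len(x)) == x:
                    total += 2.0 ** (-i) * 2.0 ** (-l)
    return total
```

Under these toy semantics, "0000" is produced by the length-1 program "0" and so dominates "0110", whose shortest generating program has length 3; this mirrors the sense in which S, unlike M, penalizes data that only slow computations can produce.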