Inferring the causal structure that links n observables is usually based on detecting statistical dependences and choosing simple graphs that make the joint measure Markovian. Here we argue why causal inference is also possible when the sample size is one. We develop a theory of how to generate causal graphs explaining similarities between single objects. To this end, we replace the notion of conditional stochastic independence in the causal Markov condition with the vanishing of conditional algorithmic mutual information and describe the corresponding causal inference rules. We explain why a consistent reformulation of causal inference in terms of algorithmic complexity implies a new inference principle that also takes into account the complexity of conditional probability densities, making it possible to select among Markov equivalent causal graphs. This insight provides a theoretical foundation for a heuristic principle proposed in earlier work. We also sketch some ideas on how to replace Kolmogorov complexity with decidable complexity criteria. This can be seen as an algorithmic analog of replacing the empirically undecidable question of statistical independence with practical independence tests based on implicit or explicit assumptions about the underlying distribution.
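For concreteness, the following is a sketch (in LaTeX notation) of the algorithmic notions the abstract refers to, stated with the standard conventions of algorithmic information theory: K denotes prefix Kolmogorov complexity, z* a shortest program for z, and equalities hold only up to additive constants that do not depend on the strings involved.

    % Conditional algorithmic mutual information:
    I(x : y \mid z) \;\overset{+}{=}\; K(x \mid z^{*}) + K(y \mid z^{*}) - K(x, y \mid z^{*})

    % Algorithmic conditional independence of x and y, given z,
    % is its vanishing (up to a constant):
    I(x : y \mid z) \;\overset{+}{=}\; 0

The algorithmic analog of the causal Markov condition then requires every node in the causal graph to be algorithmically independent of its non-descendants, given its parents, with independence understood in the sense above.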
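The closing remark, that uncomputable Kolmogorov complexity can be traded for decidable complexity criteria, can be illustrated with a real compressor. The sketch below is a hypothetical illustration only, not the procedure developed in the paper: in the spirit of the compression-based approach of Cilibrasi and Vitányi, it replaces K(.) by the length of a zlib-compressed string and uses it to score an approximate analog of algorithmic mutual information.

    # Decidable, compression-based stand-in for algorithmic mutual information.
    # K(.) is uncomputable, so this sketch approximates it by compressed length,
    # following the idea behind the normalized compression distance (NCD).
    import zlib

    def c(data: bytes) -> int:
        """Approximate K(data) by the length of its zlib compression."""
        return len(zlib.compress(data, 9))

    def mi_proxy(x: bytes, y: bytes) -> int:
        """Proxy for I(x:y) = K(x) + K(y) - K(x, y); larger values suggest shared structure."""
        return c(x) + c(y) - c(x + y)

    def ncd(x: bytes, y: bytes) -> float:
        """Normalized compression distance; values near 1 indicate approximate algorithmic independence."""
        cx, cy, cxy = c(x), c(y), c(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

    if __name__ == "__main__":
        a = b"0123456789" * 200      # highly regular string
        a2 = a + b"tail"             # nearly identical to a
        r = bytes(range(256)) * 8    # pattern unrelated to a
        # The related pair should yield a larger proxy value and a smaller NCD
        # than the unrelated pair.
        print("MI proxy (a, a2):", mi_proxy(a, a2))
        print("MI proxy (a, r): ", mi_proxy(a, r))
        print("NCD (a, a2):", round(ncd(a, a2), 3))
        print("NCD (a, r): ", round(ncd(a, r), 3))

A real compressor detects only a restricted class of regularities, so such scores relate to Kolmogorov complexity roughly as practical independence tests relate to exact statistical independence, which is the analogy drawn in the last sentence of the abstract.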