Assume that a tuple of binary strings $$ \bar a = \langle a_1, \ldots, a_n \rangle $$ has negligible mutual information with another string $b$. Does this mean that the properties of the Kolmogorov complexity of $$ \bar a $$ do not change significantly when relativized to $b$? This question becomes quite nontrivial once we try to formalize it. In this paper we investigate the problem for a special class of properties (properties expressible by a $\forall$-formula). In particular, we show that a random (conditional on $$ \bar a $$) oracle $b$ does not help to extract common information from the strings $a_i$.
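As an illustration only (not part of the paper): Kolmogorov complexity is uncomputable, but the mutual information $I(x:y) = C(x) + C(y) - C(x,y)$ appearing in the abstract can be crudely approximated by substituting a real compressor's output length for $C$. The sketch below uses zlib; the choice of compressor, string lengths, and the `mutual_info` helper are all illustrative assumptions.

```python
import random
import zlib


def C(s: bytes) -> int:
    # Stand-in for Kolmogorov complexity: compressed length is an
    # upper bound on C(s), up to an additive constant.
    return len(zlib.compress(s, 9))


def mutual_info(x: bytes, y: bytes) -> int:
    # Compression-based estimate of I(x : y) = C(x) + C(y) - C(x, y),
    # with concatenation standing in for the pair (x, y).
    return C(x) + C(y) - C(x + y)


random.seed(0)  # deterministic "incompressible" data for the demo

# Two strings sharing a long common block have large estimated mutual info.
shared = random.randbytes(1000)
a = shared + random.randbytes(200)
b = shared + random.randbytes(200)

# Two independent random strings have near-zero estimated mutual info.
u = random.randbytes(1200)
v = random.randbytes(1200)

print(mutual_info(a, b))  # large: the compressor reuses the shared block
print(mutual_info(u, v))  # small: nothing to share between u and v
```

The estimate is one-sided and noisy (compressors add headers and have finite windows), but it conveys the intuition behind "negligible mutual information" in the abstract: independent random strings share essentially no compressible structure.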