Combinatorics: set systems, hypergraphs, families of vectors, and combinatorial probability
Elements of information theory
Communication complexity
Information Theory and Reliable Communication
Informational Complexity and the Direct Sum Problem for Simultaneous Message Complexity
FOCS '01 Proceedings of the 42nd IEEE symposium on Foundations of Computer Science
A Lower Bound for the Bounded Round Quantum Communication Complexity of Set Disjointness
FOCS '03 Proceedings of the 44th Annual IEEE Symposium on Foundations of Computer Science
An information statistics approach to data stream and communication complexity
Journal of Computer and System Sciences - Special issue on FOCS 2002
An Optimal Randomised Cell Probe Lower Bound for Approximate Nearest Neighbour Searching
FOCS '04 Proceedings of the 45th Annual IEEE Symposium on Foundations of Computer Science
Prior Entanglement, Message Compression and Privacy in Quantum Communication
CCC '05 Proceedings of the 20th Annual IEEE Conference on Computational Complexity
The Communication Complexity of Correlation
CCC '07 Proceedings of the Twenty-Second Annual IEEE Conference on Computational Complexity
Probabilistic computations: Toward a unified measure of complexity
SFCS '77 Proceedings of the 18th Annual Symposium on Foundations of Computer Science
An Introduction to Kolmogorov Complexity and Its Applications
A property of quantum relative entropy with an application to privacy in quantum communication
Journal of the ACM (JACM)
A direct sum theorem in communication complexity via message compression
ICALP'03 Proceedings of the 30th international conference on Automata, languages and programming
Communication complexity of remote state preparation with entanglement
Quantum Information & Computation
Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem
IEEE Transactions on Information Theory
Proceedings of the 3rd Innovations in Theoretical Computer Science Conference
Deterministic compression with uncertain priors
Proceedings of the 5th conference on Innovations in theoretical computer science
Let X and Y be finite nonempty sets and (X,Y) a pair of random variables taking values in X × Y. We consider communication protocols between two parties, ALICE and BOB, for generating X and Y. ALICE is provided an x ∈ X generated according to the distribution of X, and is required to send a message to BOB in order to enable him to generate a y ∈ Y whose distribution is the same as that of Y|X=x. Both parties have access to a shared random string generated in advance. Let T[X : Y] be the minimum (over all protocols) of the expected number of bits ALICE needs to transmit to achieve this. We show that I[X : Y] ≤ T[X : Y] ≤ I[X : Y] + 2log2(I[X : Y] + 1) + O(1).

We also consider the worst-case communication required for this problem, where we seek to minimize the expected number of bits ALICE must transmit for the worst-case x ∈ X. We show that the communication required in this case is related to the capacity C(E) of the channel E, derived from (X,Y), that maps x ∈ X to the distribution of Y|X=x. The required communication T(E) satisfies C(E) ≤ T(E) ≤ C(E) + 2log2(C(E) + 1) + O(1).

Using the first result, we derive a direct-sum theorem in communication complexity that substantially improves the previous such result shown by Jain, Radhakrishnan, and Sen [In Proc. 30th International Colloquium on Automata, Languages and Programming (ICALP), ser. Lecture Notes in Computer Science, vol. 2719. 2003, pp. 300-315]. These results are obtained by employing a rejection sampling procedure that relates the relative entropy between two distributions to the communication complexity of generating one distribution from the other.
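The core idea can be illustrated with a minimal sketch of textbook rejection sampling (not the paper's refined procedure, which achieves the tighter bound above). Both parties use the shared random string to regenerate the same sequence of samples from a known prior p; ALICE, who knows the target distribution q = dist(Y|X=x), runs rejection sampling against that sequence and need only communicate the index of the first accepted sample. The distributions, names, and parameters below are illustrative assumptions, not from the paper.

```python
import random

def sample_index(p, q, rng):
    """Return (i, x): the index i of the first sample in a shared
    p-distributed sequence that rejection sampling accepts as a
    sample from q, together with the accepted value x.

    p, q: dicts mapping outcomes to probabilities (same support).
    rng:  a random.Random seeded with the shared random string, so
          ALICE and BOB regenerate the identical sample sequence.
    """
    support = [x for x in q if q[x] > 0]
    M = max(q[x] / p[x] for x in support)  # worst-case likelihood ratio
    i = 0
    while True:
        i += 1
        # Next sample in the shared sequence, drawn from the prior p.
        x = rng.choices(list(p), weights=list(p.values()))[0]
        # Accept with probability q(x) / (M * p(x)); the accepted
        # value is then distributed exactly according to q.
        if rng.random() < q[x] / (M * p[x]):
            return i, x

# Illustrative example: both parties know the prior p; ALICE's
# conditional distribution q is skewed toward "a".
p = {"a": 0.5, "b": 0.5}
q = {"a": 0.9, "b": 0.1}
rng = random.Random(0)
counts = {"a": 0, "b": 0}
for _ in range(5000):
    i, x = sample_index(p, q, rng)
    counts[x] += 1
print(counts)  # empirical frequencies track q (about 90% / 10%)
```

Since the expected index is M, naïvely encoding it costs about log2 M bits, and log2 M upper-bounds the relative entropy D(q‖p); the paper's contribution is a sharper sampling procedure whose expected communication is close to D(q‖p) itself.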