We describe new ways to simulate 2-party communication protocols so as to obtain protocols with potentially smaller communication. We show that every communication protocol that communicates C bits and reveals I bits of information about the inputs to the participating parties can be simulated by a new protocol involving at most Õ(√(C·I)) bits of communication. If the protocol instead reveals I bits of information about the inputs to an observer who watches the communication in the protocol, we show how to carry out the simulation with only Õ(I) bits of communication. These results lead to a direct sum theorem for randomized communication complexity. Ignoring polylogarithmic factors, we show that for worst-case computation, computing n copies of a function requires √n times the communication required for computing one copy of the function. For average-case complexity, given any distribution μ on inputs, computing n copies of the function on n inputs sampled independently according to μ requires √n times the communication for computing one copy. If μ is a product distribution, computing n copies on n independent inputs sampled according to μ requires n times the communication required for computing the function. We also study the complexity of computing the sum (or parity) of n evaluations of f, and obtain results analogous to those above.
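The bounds stated above can be summarized symbolically. A sketch, with notation assumed rather than taken from the abstract: Õ and Ω̃ hide polylogarithmic factors, IC_int and IC_ext denote internal and external information cost of a protocol π, R(f) is the randomized communication complexity of f, and D_μ(f) is the distributional complexity under μ.

```latex
% Compression: a protocol \pi communicating C bits can be simulated with
% communication depending on how much information it reveals.
\mathrm{IC}_{\mathrm{int}}(\pi) = I
  \;\Longrightarrow\; \text{simulation with } \tilde{O}\!\bigl(\sqrt{C \cdot I}\bigr) \text{ bits},
\qquad
\mathrm{IC}_{\mathrm{ext}}(\pi) = I
  \;\Longrightarrow\; \text{simulation with } \tilde{O}(I) \text{ bits}.

% Direct sum consequences for n independent copies f^n of f:
R\bigl(f^{n}\bigr) \;\ge\; \tilde{\Omega}\!\bigl(\sqrt{n}\bigr)\cdot R(f),
\qquad
D_{\mu^{n}}\bigl(f^{n}\bigr) \;\ge\; \tilde{\Omega}\!\bigl(\sqrt{n}\bigr)\cdot D_{\mu}(f),
\qquad
D_{\mu^{n}}\bigl(f^{n}\bigr) \;\ge\; \tilde{\Omega}(n)\cdot D_{\mu}(f)
  \quad \text{for product } \mu .
```

The square root in the worst-case and non-product direct sum bounds comes from the first compression statement: a protocol for f^n with communication C reveals only about C/n bits of information per copy, and compressing that copy costs Õ(√(C · C/n)) = Õ(C/√n) bits.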