Kolmogorov Complexity and Combinatorial Methods in Communication Complexity

  • Authors:
  • Marc Kaplan; Sophie Laplante

  • Affiliations:
  • LRI, Université Paris-Sud XI, 91405 Orsay CEDEX, France (both authors)

  • Venue:
  • TAMC '09: Proceedings of the 6th Annual Conference on Theory and Applications of Models of Computation
  • Year:
  • 2009

Abstract

We introduce a method based on Kolmogorov complexity to prove lower bounds on communication complexity. The intuition behind our technique is close to that of information theoretic methods [1,2]. Our goal is to gain a better understanding of how information theoretic techniques differ from the family of techniques that follow from Linial and Shraibman's work on factorization norms [3]. That family extends to quantum communication, which prevents it from being used to prove a gap between the quantum and randomized settings. We use Kolmogorov complexity for three purposes: first, to give a general lower bound in terms of Kolmogorov mutual information; second, to prove an alternative to Yao's minimax principle based on Kolmogorov complexity; and finally, to identify worst-case inputs. We show that our method implies the rectangle and corruption bounds [4], which are known to be closely related to the subdistribution bound [2]. We apply our method to the hidden matching problem, a relation introduced to prove an exponential gap between quantum and classical communication [5]. We then show that our method generalizes the VC dimension [6] and shatter coefficient [7] lower bounds. Finally, we compare one-way and simultaneous communication in the distributional setting and improve the previously known result [7].
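To make the VC dimension bound mentioned in the abstract concrete: in the one-way model, the set of rows of the communication matrix can be viewed as a concept class over Bob's inputs, and its VC dimension lower bounds one-way randomized communication [6]. The sketch below (our own illustration, not code from the paper; the `vc_dimension` helper and the equality example are assumptions made for exposition) computes this quantity by brute force for a small 0/1 matrix.

```python
from itertools import combinations

def vc_dimension(matrix):
    """VC dimension of the set of rows of a 0/1 matrix: the size of the
    largest set S of columns such that the rows, restricted to S,
    realize all 2^|S| possible 0/1 patterns (i.e. S is shattered)."""
    n_cols = len(matrix[0])
    best = 0
    for k in range(1, n_cols + 1):
        shattered = False
        for cols in combinations(range(n_cols), k):
            patterns = {tuple(row[c] for c in cols) for row in matrix}
            if len(patterns) == 2 ** k:
                shattered = True
                break
        if shattered:
            best = k
        else:
            # Every subset of a shattered set is shattered, so if no
            # k-set is shattered, no larger set can be: stop here.
            break
    return best

# Equality on 2-bit inputs: M[x][y] = 1 iff x == y (the identity matrix).
# Its rows are the singleton sets, so no pair of columns is shattered
# (the all-ones pattern is missing) and the VC dimension is 1.
eq = [[1 if x == y else 0 for y in range(4)] for x in range(4)]
print(vc_dimension(eq))  # → 1
```

This matches the intuition that equality is easy for one-way randomized protocols: a constant VC dimension gives only a constant lower bound, whereas a matrix whose rows realize all patterns over n columns has VC dimension n.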