Learning Complexity vs. Communication Complexity

  • Authors:
  • Nati Linial; Adi Shraibman

  • Venue:
  • CCC '08: Proceedings of the 2008 IEEE 23rd Annual Conference on Computational Complexity
  • Year:
  • 2008

Abstract

This paper has two main focal points. We first consider an important class of machine learning algorithms: large margin classifiers, such as Support Vector Machines. The notion of *margin complexity* quantifies the extent to which a given class of functions can be learned by large margin classifiers. We prove that, up to a small multiplicative constant, margin complexity is equal to the inverse of discrepancy. This establishes a strong tie between seemingly very different notions from two distinct areas. In the same way that matrix rigidity is related to rank, we introduce the notion of rigidity of margin complexity. We prove that sign matrices whose margin-complexity rigidity is small are very rare. This leads to the question of proving lower bounds on the rigidity of margin complexity. Quite surprisingly, this question turns out to be closely related to basic open problems in communication complexity, e.g., whether PSPACE can be separated from the polynomial hierarchy in communication complexity. There are numerous known relations between the field of learning theory and that of communication complexity [BES, Frel01, PS86, knd95], as one might expect, since communication is an inherent aspect of learning. The results of this paper constitute another link in this rich web of relations. This link has already proved significant, as it was used in the solution of a few open problems in communication complexity [ccfn, LSpS, Sher].
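
For concreteness, here is a sketch of the standard definitions behind these statements; the exact normalizations, and the precise form of margin-complexity rigidity, are assumptions on my part and may differ slightly from the paper's.

```latex
% A is an m-by-n sign matrix; x_1,...,x_m and y_1,...,y_n range over unit
% vectors in a Euclidean space of arbitrary dimension (standard formulation;
% the normalization is an assumption, not quoted from the paper).
\[
  m(A) \;=\; \sup_{x_i,\,y_j}\; \min_{i,j}\; A_{ij}\,\langle x_i, y_j\rangle,
  \qquad
  \mathrm{mc}(A) \;=\; \frac{1}{m(A)}.
\]
% Discrepancy: minimize over probability distributions P on the entries and
% maximize over combinatorial rectangles S x T.
\[
  \mathrm{disc}(A) \;=\; \min_{P}\; \max_{S \times T}\;
  \Bigl|\, \sum_{i \in S,\; j \in T} P_{ij}\, A_{ij} \,\Bigr|.
\]
% The paper's first main result: the two quantities are inverses of one
% another up to a small multiplicative constant.
\[
  \mathrm{mc}(A) \;=\; \Theta\!\Bigl(\tfrac{1}{\mathrm{disc}(A)}\Bigr).
\]
% Rigidity of margin complexity, by analogy with matrix rigidity (this precise
% form is an assumption): the number of sign flips needed to bring the margin
% complexity of A below a target r.
\[
  R_{\mathrm{mc}}(A, r) \;=\;
  \min\bigl\{\, |\{(i,j) : A_{ij} \ne B_{ij}\}| \;:\;
  B \in \{\pm 1\}^{m \times n},\ \mathrm{mc}(B) \le r \,\bigr\}.
\]
```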