Probabilistic communication complexity
Journal of Computer and System Sciences
Some combinatorial-algebraic problems from complexity theory
Discrete Mathematics - Special issue: trends in discrete mathematics
The nature of statistical learning theory
On randomized one-round communication complexity
STOC '95 Proceedings of the twenty-seventh annual ACM symposium on Theory of computing
Handbook of combinatorics (vol. 2)
Communication complexity
Information Processing Letters
An introduction to Support Vector Machines and other kernel-based learning methods
The discrepancy method: randomness and complexity
Lectures on Discrete Geometry
FOCS '95 Proceedings of the 36th Annual Symposium on Foundations of Computer Science
On notions of information transfer in VLSI circuits
STOC '83 Proceedings of the fifteenth annual ACM symposium on Theory of computing
A Linear Lower Bound on the Unbounded Error Probabilistic Communication Complexity
CCC '01 Proceedings of the 16th Annual Conference on Computational Complexity
Limitations of learning via embeddings in euclidean half spaces
The Journal of Machine Learning Research
Approximating the cut-norm via Grothendieck's inequality
STOC '04 Proceedings of the thirty-sixth annual ACM symposium on Theory of computing
Embedding with a Lipschitz function
Random Structures & Algorithms
Theoretical Computer Science - Algorithmic Learning Theory (ALT 2002)
Lower bounds in communication complexity based on factorization norms
Proceedings of the thirty-ninth annual ACM symposium on Theory of computing
Communication Complexity under Product and Nonproduct Distributions
CCC '08 Proceedings of the 2008 IEEE 23rd Annual Conference on Computational Complexity
A Direct Product Theorem for Discrepancy
CCC '08 Proceedings of the 2008 IEEE 23rd Annual Conference on Computational Complexity
On determinism versus non-determinism and related problems
SFCS '83 Proceedings of the 24th Annual Symposium on Foundations of Computer Science
Geometrical realization of set systems and probabilistic communication complexity
SFCS '85 Proceedings of the 26th Annual Symposium on Foundations of Computer Science
Complexity classes in communication complexity theory
SFCS '86 Proceedings of the 27th Annual Symposium on Foundations of Computer Science
Complexity measures of sign matrices
Combinatorica
On the limitations of embedding methods
COLT'05 Proceedings of the 18th annual conference on Learning Theory
Computational Complexity
Unbounded-error classical and quantum communication complexity
ISAAC'07 Proceedings of the 18th international conference on Algorithms and computation
Composition theorems in communication complexity
ICALP'10 Proceedings of the 37th international colloquium conference on Automata, languages and programming
SIAM Journal on Computing
A lower bound on entanglement-assisted quantum communication complexity
ICALP'07 Proceedings of the 34th international conference on Automata, Languages and Programming
This paper has two main focal points. First, we consider an important class of machine learning algorithms: large margin classifiers, such as Support Vector Machines. The notion of margin complexity quantifies the extent to which a given class of functions can be learned by large margin classifiers. We prove that, up to a small multiplicative constant, margin complexity equals the inverse of discrepancy. This establishes a strong tie between seemingly very different notions from two distinct areas. Second, in the same way that matrix rigidity is related to rank, we introduce the notion of rigidity of margin complexity. We prove that sign matrices with small margin complexity rigidity are very rare, which leads to the question of proving lower bounds on the rigidity of margin complexity. Quite surprisingly, this question turns out to be closely related to basic open problems in communication complexity, e.g., whether PSPACE can be separated from the polynomial hierarchy in communication complexity. Communication is a key ingredient in many types of learning, which explains the relations between the field of learning theory and that of communication complexity [6, 10, 16, 26]. The results of this paper constitute another link in this rich web of relations, and have already been applied toward the solution of several open problems in communication complexity [18, 20, 29].
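To make the margin-complexity/discrepancy connection concrete, the following is a sketch of the standard definitions involved; the notation ($\mathrm{mc}$, $\mathrm{disc}$, the unit vectors $x_i, y_j$) is illustrative, and the constant hidden in the $\Theta(\cdot)$ is the paper's "small multiplicative constant", not reproduced here.

```latex
% For a sign matrix $A \in \{-1,+1\}^{m \times n}$, the margin is the best
% worst-case margin over all unit-vector realizations of $A$'s sign pattern:
\[
  \mathrm{margin}(A) \;=\; \sup_{\substack{x_1,\dots,x_m,\;y_1,\dots,y_n \\ \|x_i\|=\|y_j\|=1}}
  \;\min_{i,j}\; A_{ij}\,\langle x_i, y_j \rangle ,
  \qquad
  \mathrm{mc}(A) \;=\; \frac{1}{\mathrm{margin}(A)} .
\]
% Discrepancy minimizes, over distributions $P$ on the entries, the largest
% $P$-weighted bias of $A$ on any combinatorial rectangle $R = S \times T$:
\[
  \mathrm{disc}(A) \;=\; \min_{P}\;\max_{R = S \times T}\;
  \Bigl|\, \sum_{(i,j) \in R} P(i,j)\, A_{ij} \,\Bigr| .
\]
% The equivalence stated in the abstract then reads: for every sign matrix $A$,
\[
  \mathrm{mc}(A) \;=\; \Theta\!\bigl( 1 / \mathrm{disc}(A) \bigr).
\]
```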