Estimating the Optimal Margins of Embeddings in Euclidean Half Spaces
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
This paper discusses theoretical limitations of classification systems that are based on feature maps and use a separating hyperplane in the feature space. In particular, we study the embeddability of a given concept class into a class of Euclidean half-spaces, either of low dimension, or of arbitrarily large dimension but realizing a large margin. We present new bounds on the smallest possible dimension and on the largest possible margin. In addition, we present new results on the rigidity of matrices and briefly mention applications in complexity and learning theory.
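As a toy illustration of the notion studied here (the matrix, vectors, and function name below are our own and not from the paper): a finite concept class can be written as a sign matrix M, and an embedding into Euclidean half-spaces assigns unit vectors to concepts and instances so that the sign of each inner product matches M; the margin realized is the smallest absolute inner product.

```python
import numpy as np

# A concept class over a finite domain as a sign matrix M:
# M[i, j] = +1 if concept i labels instance j positively, -1 otherwise.
# (A hypothetical 2x2 example for illustration.)
M = np.array([[ 1, -1],
              [-1,  1]])

def margin_of_embedding(M, U, V):
    """Margin realized by unit-vector embeddings U (concepts) and V (instances):
    the embedding is valid if sign(U[i] @ V[j]) == M[i, j] for all i, j,
    and the realized margin is the smallest |U[i] @ V[j]|."""
    U = U / np.linalg.norm(U, axis=1, keepdims=True)
    V = V / np.linalg.norm(V, axis=1, keepdims=True)
    P = U @ V.T
    assert np.all(np.sign(P) == M), "not a valid half-space embedding"
    return np.abs(P).min()

# One valid 2-dimensional embedding of M.
U = np.array([[1.0, -1.0], [-1.0, 1.0]])
V = np.array([[1.0, 0.0], [0.0, 1.0]])
print(margin_of_embedding(M, U, V))  # 1/sqrt(2) ~ 0.7071
```

The bounds discussed in the paper constrain, for a given sign matrix, how small the dimension of such an embedding can be, and how large its margin can be made.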