Matrix analysis.
Learnability and the Vapnik-Chervonenkis dimension. Journal of the ACM (JACM).
Lower Bound Methods and Separation Results for On-Line Learning Models. Machine Learning.
An introduction to computational learning theory.
Geometric arguments yield better bounds for threshold circuits and distributed computing. Theoretical Computer Science.
An introduction to support vector machines and other kernel-based learning methods.
A Linear Lower Bound on the Unbounded Error Probabilistic Communication Complexity. CCC '01 Proceedings of the 16th Annual Conference on Computational Complexity; Computational Complexity.
VC dimension and inner product space induced by Bayesian networks. International Journal of Approximate Reasoning.
COLT'05 Proceedings of the 18th Annual Conference on Learning Theory.
Concept classes can canonically be represented by matrices with entries $1$ and $-1$. We use the singular value decomposition of such a matrix to determine the optimal margins of embeddings of the concept classes of singletons and of half intervals in homogeneous Euclidean half spaces. For these concept classes the singular value decomposition can be used to construct optimal embeddings and also to prove the corresponding best possible upper bounds on the margin. We show that the optimal margin for embedding $n$ singletons is $\frac{n}{3n-4}$ and that the optimal margin for half intervals over $\{1,\dots,n\}$ is $\frac{\pi}{2 \ln n} + \Theta\!\left(\frac{1}{(\ln n)^2}\right)$. For the upper bounds on the margins we generalize a bound by Forster (2001). We also determine, up to a small constant factor, the optimal margin of some concept classes defined by circulant matrices, and we discuss the concept class of monomials to point out limitations of our approach.
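The upper-bound technique described above generalizes Forster's spectral bound. As a minimal numerical sketch (assuming Forster's 2001 result that any embedding of an $m \times n$ sign matrix $M$ in homogeneous half spaces has margin at most $\|M\| / \sqrt{mn}$, where $\|M\|$ is the largest singular value), the following computes that bound for the sign matrix of $n$ singletons, $2I - J$, and compares it with the optimal margin $\frac{n}{3n-4}$ stated in the abstract; the function names are illustrative, not from the paper:

```python
import numpy as np

def singletons_matrix(n):
    # Sign matrix of the concept class of n singletons over an n-point
    # domain: +1 on the diagonal, -1 everywhere else, i.e. 2I - J.
    return 2 * np.eye(n) - np.ones((n, n))

def forster_upper_bound(M):
    # Forster-style margin bound (assumed form): any realization of the
    # sign matrix M in homogeneous half spaces has margin at most
    # ||M|| / sqrt(m * n), with ||M|| the largest singular value.
    m, n = M.shape
    return np.linalg.svd(M, compute_uv=False)[0] / np.sqrt(m * n)

n = 10
M = singletons_matrix(n)
bound = forster_upper_bound(M)   # spectral norm of 2I - J is n - 2, so (n-2)/n
optimal = n / (3 * n - 4)        # optimal margin stated in the abstract
print(bound, optimal)
```

For $2I - J$ the plain spectral bound $(n-2)/n$ is much looser than the true optimum $\approx 1/3$, which is consistent with the abstract's point that a generalization of Forster's bound is needed to match the optimal embeddings.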