We generalize the classical notion of Vapnik-Chervonenkis (VC) dimension to ordinal VC-dimension, in the context of logical learning paradigms. Logical learning paradigms encompass the numerical learning paradigms commonly studied in Inductive Inference. A logical learning paradigm is defined as a set W of structures over some vocabulary, together with a set D of first-order formulas that represent data. The sets of models of ϕ in W, where ϕ varies over D, generate a natural topology on W.

We show that if D is closed under Boolean operators, then the notion of ordinal VC-dimension offers a perfect characterization of the problem of predicting the truth of the members of D in a member of W, with an ordinal bound on the number of mistakes. This shows that the notion of VC-dimension has a natural interpretation in Inductive Inference when cast into a logical setting. We also study the relationships between predictive complexity, selective complexity (a variation on predictive complexity), and mind change complexity. The assumptions that D is closed under Boolean operators and that W is compact often play a crucial role in establishing connections between these concepts.

We then consider a computable setting with effective versions of the complexity measures, and show that the equivalence between ordinal VC-dimension and predictive complexity fails. More precisely, we prove that the effective ordinal VC-dimension of a paradigm can be defined when all other effective notions of complexity are undefined. On a better note, when W is compact, all effective notions of complexity are defined, though they are not related as in the noncomputable version of the framework.
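As a point of reference for the classical notion being generalized, the following sketch computes the (finite, non-ordinal) VC dimension of a hypothesis class over a finite domain by brute-force checking which point sets are shattered. The function names and the interval example are illustrative choices, not part of the paper's framework.

```python
from itertools import combinations

def shatters(hypotheses, points):
    """A class shatters a point set if every subset of the points is
    realized as the intersection of the points with some hypothesis."""
    points = frozenset(points)
    realized = {points & h for h in hypotheses}
    return len(realized) == 2 ** len(points)

def vc_dimension(hypotheses, domain):
    """Largest d such that some d-element subset of the (finite) domain
    is shattered; brute force, for illustration only."""
    domain = list(domain)
    dim = 0
    for d in range(1, len(domain) + 1):
        if any(shatters(hypotheses, s) for s in combinations(domain, d)):
            dim = d
    return dim

# Example: all intervals {i, ..., j-1} over {0, 1, 2, 3}, including the
# empty interval. Intervals on a line have VC dimension 2: any two points
# are shattered, but for a < b < c no interval picks out {a, c} without b.
intervals = {frozenset(range(i, j)) for i in range(5) for j in range(i, 5)}
print(vc_dimension(intervals, range(4)))  # → 2
```

Ordinal VC-dimension, as studied in the paper, replaces this single numeric bound with an ordinal bound on the number of prediction mistakes, which this finite sketch does not capture.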