The Consistency Dimension and Distribution-Dependent Learning from Queries (Extended Abstract)
ALT '99 Proceedings of the 10th International Conference on Algorithmic Learning Theory
The consistency dimension, in several variants, is a recently introduced parameter useful for the study of polynomial-query learning models: it characterizes the representation classes that are learnable in the corresponding models. By choosing a sufficiently abstract notion of representation class, we formalize the intuition that these dimensions relate to compactness issues, both in Logic and in a specific topological space. We are thus led to introduce Quantitative Compactness notions, which simultaneously have a clear topological meaning and still characterize the polynomial-query-learnable representation classes of boolean functions. They may be relevant elsewhere as well. Their study is still ongoing, so this paper is in a sense visionary, and might be flawed.
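For orientation, one common variant of the consistency dimension from the query-learning literature can be sketched as follows; this is a hedged illustration and not necessarily the exact variant developed in the paper:

```latex
% Sketch of one variant of the consistency dimension (hedged; the paper
% may use a different or more refined formulation).
% Let $C$ be a representation class of boolean functions over a domain $X$,
% and let a sample be a finite labeled set $S \subseteq X \times \{0,1\}$.
\[
\mathrm{cdim}(C) \le d
\quad\Longleftrightarrow\quad
\forall S:\;
\bigl(\nexists\, f \in C \text{ consistent with } S\bigr)
\;\Rightarrow\;
\exists\, S' \subseteq S,\ |S'| \le d,\
\nexists\, f \in C \text{ consistent with } S'.
\]
```

Read this way, the condition mirrors compactness in logic: whenever a sample is globally inconsistent with the class, some subsample of size at most $d$ already witnesses the inconsistency, just as an unsatisfiable set of formulas has a finite unsatisfiable subset. This is the sense in which the dimensions above relate to compactness.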