This paper derives the Vapnik-Chervonenkis (VC) dimension of several natural subclasses of pattern languages. For classes with unbounded VC dimension, the paper quantifies the "rate of growth" of the VC dimension by computing, for each n, the size of the smallest witness set of n elements that is shattered by the class. Both erasing pattern languages (empty substitutions allowed) and nonerasing pattern languages (empty substitutions not allowed) are considered. For erasing pattern languages, bounds on this size that are optimal within polynomial order are derived for the case of one variable occurrence over a unary alphabet, for the case where the number of variable occurrences is bounded by a constant, and for the general case of all pattern languages. The extent to which these results carry over to nonerasing pattern languages is also investigated, and some results that shed light on efficient learning of subclasses of pattern languages are given.
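The two notions the abstract relies on, membership in a pattern language and shattering of a witness set, can be illustrated with a brute-force sketch. This is not the paper's construction; it is a minimal illustration under assumed conventions: uppercase letters are variables, lowercase letters are terminals, and `erasing=True` permits empty substitutions.

```python
def matches(pattern, word, erasing=False, bindings=None):
    """Return True if `word` arises from `pattern` by substituting a string
    for each variable (uppercase); terminals (lowercase) match literally.
    Nonerasing mode (the default) requires every substitution to be nonempty.
    Exponential-time brute force, intended only for tiny examples."""
    if bindings is None:
        bindings = {}
    if not pattern:                      # empty pattern matches only the empty word
        return not word
    head, rest = pattern[0], pattern[1:]
    if head.islower():                   # terminal: must match the next symbol
        return bool(word) and word[0] == head and \
            matches(rest, word[1:], erasing, bindings)
    if head in bindings:                 # variable already bound: reuse its value
        val = bindings[head]
        return word.startswith(val) and \
            matches(rest, word[len(val):], erasing, bindings)
    lo = 0 if erasing else 1             # shortest substitution allowed
    for k in range(lo, len(word) + 1):   # try every candidate binding
        if matches(rest, word[k:], erasing, {**bindings, head: word[:k]}):
            return True
    return False

def shattered(words, patterns, erasing=False):
    """True if the family {L(p) : p in patterns} shatters `words`, i.e.
    every subset of `words` is cut out exactly by some pattern's language."""
    words = list(words)
    dichotomies = {tuple(matches(p, w, erasing) for w in words)
                   for p in patterns}
    return len(dichotomies) == 2 ** len(words)
```

For example, over the alphabet {a, b} the two-element set {"a", "b"} is shattered by the four nonerasing patterns "X", "a", "b", "ab", since their languages realize all four subsets, so the VC dimension of any class containing these patterns is at least 2.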