Although PAC learning of unrestricted regular languages has long been known to be a very difficult problem, one might suppose that natural, efficiently learnable subfamilies exist, perhaps in abundance. When our literature search for such a family came up empty, we proposed the shuffle ideals as a prime candidate. The shuffle ideal generated by a string u is simply the collection of all strings containing u as a (not necessarily contiguous) subsequence. This fundamental language family is of theoretical interest in its own right and also provides the building blocks for other important language families. Somewhat surprisingly, we discovered that even a class as simple as the shuffle ideals is not properly PAC learnable, unless RP = NP. In the positive direction, we give an efficient algorithm for properly learning shuffle ideals in the statistical query (and therefore also PAC) model under the uniform distribution.
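Membership in a shuffle ideal reduces to a subsequence check, which can be decided in a single left-to-right scan. A minimal sketch (the function name `in_shuffle_ideal` is illustrative, not from the paper):

```python
def in_shuffle_ideal(u: str, x: str) -> bool:
    """Return True iff x lies in the shuffle ideal generated by u,
    i.e. u occurs in x as a (not necessarily contiguous) subsequence.
    Greedy one-pass scan: match each character of u against the
    earliest unconsumed occurrence in x."""
    it = iter(x)
    # `c in it` advances the iterator past the first match of c,
    # so later characters of u are only sought further right in x.
    return all(c in it for c in u)

# Example: "ab" is a subsequence of "xaybz", but "ba" is not.
print(in_shuffle_ideal("ab", "xaybz"))  # True
print(in_shuffle_ideal("ba", "xaybz"))  # False
```

The greedy leftmost-match strategy is correct here: if any embedding of u into x exists, the one that matches each symbol as early as possible also exists, so the scan never needs to backtrack.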