Natural languages are often assumed to be constrained so that they are either easily learnable or easily parsable, but few studies have investigated the connection between these two "functional" demands. Without a formal model of parsability or learnability, it is difficult to determine which is more "dominant" in fixing the properties of natural languages. In this paper we show that if we adopt one precise model of "easy" parsability, namely, that of bounded context parsability, and a precise model of "easy" learnability, namely, that of degree 2 learnability, then certain families of grammars that meet the bounded context parsability condition are also degree 2 learnable. Some implications of this result for learning in other subsystems of linguistic knowledge are suggested.
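To make the parsability side of the claim concrete, the sketch below illustrates bounded context parsing: a shift-reduce parser whose reduction decisions are fixed by a bounded window of left and right context around the handle. The toy grammar (S -> a S b | c), the BC(1,1) reduction table, and all identifiers here are invented for illustration; this is a minimal sketch of the general idea, not the construction used in the paper.

```python
# Minimal sketch of bounded context (BC(1,1)) shift-reduce parsing,
# assuming the invented toy grammar  S -> a S b | c.

# Reduction table: (left context, handle, right context) -> nonterminal.
# '$' marks the start/end of the string. A grammar is bounded context
# when every such triple determines the reduction uniquely.
REDUCTIONS = {
    ("a", ("c",), "b"): "S",
    ("$", ("c",), "$"): "S",
    ("a", ("a", "S", "b"), "b"): "S",
    ("$", ("a", "S", "b"), "$"): "S",
}

def bc_parse(tokens):
    """Parse, deciding reductions from 1 symbol of left and right context."""
    stack, rest = [], list(tokens)
    while True:
        # Try each possible handle length at the top of the stack.
        reduced = False
        for k in range(1, min(3, len(stack)) + 1):
            handle = tuple(stack[-k:])
            left = stack[-k - 1] if len(stack) > k else "$"
            right = rest[0] if rest else "$"
            lhs = REDUCTIONS.get((left, handle, right))
            if lhs is not None:
                del stack[-k:]        # reduce: replace handle with its
                stack.append(lhs)     # left-hand side nonterminal
                reduced = True
                break
        if reduced:
            continue
        if rest:
            stack.append(rest.pop(0))  # shift the next input symbol
        else:
            return stack == ["S"]      # accept iff only the start symbol remains

print(bc_parse(list("aacbb")))  # True
print(bc_parse(list("acb")))    # True
print(bc_parse(list("ab")))     # False
```

The point of the bounded window is that the parser never needs unbounded lookahead or global reanalysis: every decision is a finite table lookup, which is the kind of locality the learnability side of the result exploits.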