Bounded context parsing and easy learnability

  • Authors:
  • Robert C. Berwick

  • Affiliations:
  • MIT Artificial Intelligence Lab, Cambridge, MA

  • Venue:
  • ACL '84: Proceedings of the 10th International Conference on Computational Linguistics and the 22nd Annual Meeting of the Association for Computational Linguistics
  • Year:
  • 1984

Abstract

Natural languages are often assumed to be constrained so that they are either easily learnable or parsable, but few studies have investigated the connection between these two "functional" demands. Without a formal model of parsability or learnability, it is difficult to determine which is more "dominant" in fixing the properties of natural languages. In this paper we show that if we adopt one precise model of "easy" parsability, namely, bounded context parsability, and a precise model of "easy" learnability, namely, degree 2 learnability, then certain families of grammars that meet the bounded context parsability condition will also be degree 2 learnable. Some implications of this result for learning in other subsystems of linguistic knowledge are suggested.