Prepositional phrase (PP) attachment is an important subproblem of parsing whose performance suffers from the limited availability of labelled data. We present a semi-supervised approach. We show that a discriminative lexical model trained on labelled data and a generative lexical model learned via Expectation Maximization (EM) from unlabelled data can be combined in a product model to yield a PP-attachment model that is better than either alone, and that outperforms the parser of Petrov and Klein (2007) by a significant margin. We also show that, when learning from unlabelled data, it can be beneficial to model the generation of a head's modifiers collectively rather than individually. Finally, we suggest that our pair of models would be interesting to combine using recent techniques for discriminatively constraining EM.
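The combination step can be pictured as a normalized product of experts over the candidate attachment sites. The sketch below is illustrative only, not the paper's implementation: the function name and the input probabilities are hypothetical, and each model is reduced to a distribution over two attachment sites.

```python
def product_model(p_disc, p_gen):
    """Combine two models' attachment distributions as a normalized
    product of experts.

    p_disc, p_gen: dicts mapping an attachment site ('verb' or 'noun')
    to that model's probability of the attachment. Both the dict keys
    and the values below are hypothetical, for illustration.
    """
    # Multiply the experts' probabilities pointwise, then renormalize
    # so the combined scores form a distribution again.
    scores = {site: p_disc[site] * p_gen[site] for site in p_disc}
    z = sum(scores.values())
    return {site: s / z for site, s in scores.items()}


# Toy example: the discriminative model mildly prefers noun attachment,
# the generative model strongly prefers it; the product sharpens the
# preference the two models share.
p_disc = {"verb": 0.45, "noun": 0.55}
p_gen = {"verb": 0.20, "noun": 0.80}
combined = product_model(p_disc, p_gen)
```

Because the product penalizes any attachment that either expert considers unlikely, the combined model can be more confident than either component on decisions where the two agree.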