Perception as Bayesian inference
Dynamic Bayesian networks: representation, inference and learning
Building a large annotated corpus of English: the Penn Treebank
Computational Linguistics - Special issue on using large corpora: II
Bayesian nets in syntactic categorization of novel words
NAACL-Short '03 Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology: companion volume of the Proceedings of HLT-NAACL 2003--short papers - Volume 2
Corpus-based induction of syntactic structure: models of dependency and constituency
ACL '04 Proceedings of the 42nd Annual Meeting on Association for Computational Linguistics
Incremental Bayesian networks for structure prediction
Proceedings of the 24th international conference on Machine learning
A latent variable model for generative dependency parsing
IWPT '07 Proceedings of the 10th International Conference on Parsing Technologies
Incremental Sigmoid Belief Networks for Grammar Learning
The Journal of Machine Learning Research
Identification of multi-word expressions by combining multiple linguistic information sources
EMNLP '11 Proceedings of the Conference on Empirical Methods in Natural Language Processing
Exact parsing with finite-state automata is considered inappropriate because of the unbounded non-locality that natural languages overwhelmingly exhibit. We propose a way to structure the parsing task so that it becomes amenable to local classification methods. This allows us to build a Dynamic Bayesian Network that uncovers the syntactic dependency structure of English sentences. Experiments on the Wall Street Journal corpus demonstrate that the model successfully learns from labeled data.
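To illustrate the kind of slice-by-slice local computation that makes Dynamic Bayesian Networks tractable, here is a minimal sketch using the simplest possible DBN, a first-order hidden Markov model with the forward algorithm. This is not the paper's dependency model; the structure, function names, and all probability values below are illustrative assumptions.

```python
# Sketch: forward inference in the simplest DBN (a first-order HMM).
# Hidden state S_t depends only on S_{t-1}; observation O_t depends
# only on S_t, so each time slice is processed with purely local
# computation. All probabilities are toy values, not from the paper.

def forward(obs, states, start_p, trans_p, emit_p):
    """Return P(obs) by summing over all hidden state sequences."""
    # Initialize with the first observation.
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    # Advance one time slice at a time, using only the previous alpha.
    for o in obs[1:]:
        alpha = {
            s: sum(alpha[prev] * trans_p[prev][s] for prev in states)
               * emit_p[s][o]
            for s in states
        }
    return sum(alpha.values())

# Toy tagger: two hidden POS states ("N", "V") emitting two word types.
states = ("N", "V")
start_p = {"N": 0.6, "V": 0.4}
trans_p = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.8, "V": 0.2}}
emit_p = {"N": {"dog": 0.9, "runs": 0.1},
          "V": {"dog": 0.2, "runs": 0.8}}

p = forward(("dog", "runs"), states, start_p, trans_p, emit_p)
```

The per-slice update touches only the previous slice's quantities, which is what allows local classification methods to be plugged into richer DBN structures such as the dependency model described above.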