Head-driven statistical models for natural language parsing
PCFG models of linguistic tree representations
Computational Linguistics
A maximum-entropy-inspired parser
NAACL 2000 Proceedings of the 1st North American chapter of the Association for Computational Linguistics conference
Compacting the Penn Treebank grammar
COLING '98 Proceedings of the 17th international conference on Computational linguistics - Volume 1
A statistical parser for Czech
ACL '99 Proceedings of the 37th annual meeting of the Association for Computational Linguistics on Computational Linguistics
Converting dependency structures to phrase structures
HLT '01 Proceedings of the first international conference on Human language technology research
Recovering latent information in treebanks
COLING '02 Proceedings of the 19th international conference on Computational linguistics - Volume 1
Accurate unlexicalized parsing
ACL '03 Proceedings of the 41st Annual Meeting on Association for Computational Linguistics - Volume 1
Is it harder to parse Chinese, or the Chinese Treebank?
ACL '03 Proceedings of the 41st Annual Meeting on Association for Computational Linguistics - Volume 1
On the parameter space of generative lexicalized statistical parsing models
Intricacies of Collins' Parsing Model
Computational Linguistics
Two statistical parsing models applied to the Chinese Treebank
CLPW '00 Proceedings of the second workshop on Chinese language processing: held in conjunction with the 38th Annual Meeting of the Association for Computational Linguistics - Volume 12
Probabilistic CFG with latent annotations
ACL '05 Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics
Coarse-to-fine n-best parsing and MaxEnt discriminative reranking
ACL '05 Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics
Learning accurate, compact, and interpretable tree annotation
ACL-44 Proceedings of the 21st International Conference on Computational Linguistics and the 44th annual meeting of the Association for Computational Linguistics
Deterministic dependency parsing of English text
COLING '04 Proceedings of the 20th international conference on Computational Linguistics
Effective self-training for parsing
HLT-NAACL '06 Proceedings of the main conference on Human Language Technology Conference of the North American Chapter of the Association of Computational Linguistics
TAG, dynamic programming, and the perceptron for efficient, feature-rich parsing
CoNLL '08 Proceedings of the Twelfth Conference on Computational Natural Language Learning
Self-training PCFG grammars with latent annotations across languages
EMNLP '09 Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing: Volume 2 - Volume 2
Statistical parsing with a context-free grammar and word statistics
AAAI'97/IAAI'97 Proceedings of the fourteenth national conference on artificial intelligence and ninth conference on Innovative applications of artificial intelligence
Lexicalized beam thresholding parsing with prior and boundary estimates
CICLing'05 Proceedings of the 6th international conference on Computational Linguistics and Intelligent Text Processing
Features for phrase-structure reranking from dependency parses
IWPT '11 Proceedings of the 12th International Conference on Parsing Technologies
NAACL HLT '12 Proceedings of the 2012 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
In this paper we present a novel phrase-structure parsing approach that exploits dependency structure. Unlike existing phrase-structure parsers, our inference procedure is guided by a dependency structure, which makes parsing more flexible. Experimental results show that our approach is substantially more accurate. With gold dependency trees, our parser achieves an F1 score of 96.08% on the Penn English Treebank and 90.61% on the Penn Chinese Treebank. With N-best dependency trees generated by a modified MSTParser, the F1 score reaches 90.54% for English and 83.93% for Chinese.
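To make the idea of dependency-guided inference concrete, here is a minimal, hypothetical sketch (not the paper's actual algorithm): from a projective dependency tree one can compute the span each head word dominates, and a phrase-structure parser could then restrict its chart to constituents consistent with those spans. The function `dependency_spans` and its head-array input format are illustrative assumptions.

```python
# A hypothetical sketch: derive span constraints from a dependency tree
# that could guide phrase-structure inference.
# heads[i] is the index of word i's head (-1 for the root), 0-based.

def dependency_spans(heads):
    """Return the inclusive (start, end) span each word dominates,
    assuming the dependency tree is projective."""
    n = len(heads)
    start = list(range(n))
    end = list(range(n))
    # Propagate each word's position up to all of its ancestors.
    for i in range(n):
        j = heads[i]
        while j != -1:
            start[j] = min(start[j], i)
            end[j] = max(end[j], i)
            j = heads[j]
    return list(zip(start, end))

# "John saw Mary": "saw" (index 1) is the root, heading "John" and "Mary".
spans = dependency_spans([1, -1, 1])
print(spans)  # [(0, 0), (0, 2), (2, 2)]
```

Under this sketch, a chart parser would only build constituents whose spans nest with the head-dominated spans above, which prunes the search space while leaving the choice of nonterminal labels to the phrase-structure model.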