This paper proposes a hierarchical model for parsing both English and Chinese sentences. Simple constituents are constructed first, in iterative passes, so that complex constituents can be detected reliably in later passes using the richer contextual information accumulated so far. Evaluation on the Penn WSJ Treebank and the Penn Chinese Treebank using maximum entropy models shows that the method achieves good performance while leaving room for future improvement.
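The layered strategy described above can be sketched roughly as follows. This is an illustrative assumption, not the authors' implementation: the maximum entropy model is abstracted as a `score` callable that returns the probability that two adjacent units form a constituent, and the names `parse_layers` and `max_layers` are hypothetical.

```python
def parse_layers(tokens, score, max_layers=3):
    """Iteratively chunk a token sequence: each pass merges adjacent
    units whose merge score exceeds 0.5, so later passes see richer,
    already-parsed context (a toy stand-in for the hierarchical idea)."""
    units = [(tok,) for tok in tokens]  # start from simple units
    for _ in range(max_layers):
        merged, i, changed = [], 0, False
        while i < len(units):
            # `score` stands in for a maximum entropy classifier over
            # features of the two candidate sub-constituents.
            if i + 1 < len(units) and score(units[i], units[i + 1]) > 0.5:
                merged.append((units[i], units[i + 1]))  # new constituent
                i += 2
                changed = True
            else:
                merged.append(units[i])
                i += 1
        units = merged
        if not changed:  # no merges in this pass: parse has converged
            break
    return units
```

A toy scorer that only merges a determiner with a following noun illustrates the first pass; a real model would score arbitrary adjacent spans from learned features.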