Learning context-free grammars from structural data in polynomial time
Theoretical Computer Science
A dynamic language model for speech recognition
HLT '91 Proceedings of the workshop on Speech and Natural Language
A comparison of indexing techniques for Japanese text retrieval
SIGIR '93 Proceedings of the 16th annual international ACM SIGIR conference on Research and development in information retrieval
A maximum entropy approach to natural language processing
Computational Linguistics
Exact sampling with coupled Markov chains and applications to statistical mechanics
Proceedings of the seventh international conference on Random structures and algorithms
Statistical methods for speech recognition
Foundations of statistical natural language processing
The String-to-String Correction Problem
Journal of the ACM (JACM)
Inducing Features of Random Fields
Maximum entropy models for natural language ambiguity resolution
A maximum entropy approach to named entity recognition
Building a large annotated corpus of English: the Penn Treebank
Computational Linguistics - Special issue on using large corpora: II
Inside-outside reestimation from partially bracketed corpora
ACL '92 Proceedings of the 30th annual meeting on Association for Computational Linguistics
The effects of word order and segmentation on translation retrieval performance
COLING '00 Proceedings of the 18th conference on Computational linguistics - Volume 1
Combination of n-grams and Stochastic Context-Free Grammars for language modeling
COLING '00 Proceedings of the 18th conference on Computational linguistics - Volume 1
Toward memory-based translation
COLING '90 Proceedings of the 13th conference on Computational linguistics - Volume 3
CTM: an example-based translation aid system
COLING '92 Proceedings of the 14th conference on Computational linguistics - Volume 4
The SMART Retrieval System—Experiments in Automatic Document Processing
Using perfect sampling in parameter estimation of a whole sentence maximum entropy language model
CoNLL '00 Proceedings of the 2nd workshop on Learning language in logic and the 4th conference on Computational natural language learning - Volume 7
Efficient sampling and feature selection in whole sentence maximum entropy language models
ICASSP '99 Proceedings of the Acoustics, Speech, and Signal Processing, 1999. on 1999 IEEE International Conference - Volume 01
ANERsys: An Arabic Named Entity Recognition System Based on Maximum Entropy
CICLing '07 Proceedings of the 8th International Conference on Computational Linguistics and Intelligent Text Processing
Refining generative language models using discriminative learning
EMNLP '08 Proceedings of the Conference on Empirical Methods in Natural Language Processing
Proceedings of the 2010 conference on Human Language Technologies -- The Baltic Perspective: Proceedings of the Fourth International Conference Baltic HLT 2010
In this paper, we propose adding long-term grammatical information to a Whole Sentence Maximum Entropy Language Model (WSME) in order to improve the performance of the model. The grammatical information was added to the WSME model as features obtained from a Stochastic Context-Free Grammar. Finally, experiments using a part of the Penn Treebank corpus were carried out, and significant improvements were achieved.
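The core idea of a whole-sentence maximum entropy model is to rescore an entire sentence by multiplying a baseline probability by an exponential term over sentence-level features. The following is a minimal sketch, not the authors' implementation: the baseline `p0`, the feature functions, and the weights are all hypothetical stand-ins (a real SCFG feature would come from parsing the sentence under a learned grammar).

```python
import math

def wsme_score(sentence, p0, features, lambdas):
    """Unnormalized WSME score: p0(s) * exp(sum_i lambda_i * f_i(s)).

    p0       -- baseline language model probability of the sentence
    features -- sentence-level feature functions f_i (e.g. grammatical
                features; here they are hypothetical stand-ins)
    lambdas  -- maximum entropy weights, one per feature
    """
    exponent = sum(lam * f(sentence) for f, lam in zip(features, lambdas))
    return p0(sentence) * math.exp(exponent)

# Hypothetical example: a constant baseline, plus one binary feature
# standing in for "the sentence parses under the grammar" (a real
# system would run an SCFG parser here).
p0 = lambda s: 0.001
parses = lambda s: 1.0 if s.endswith(".") else 0.0
score = wsme_score("The model works.", p0, [parses], [0.5])
```

With the feature active, the sketch multiplies the baseline by `exp(0.5)`, illustrating how grammatical features can raise or lower a whole-sentence score without renormalizing per word.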