We study the impact of richer syntactic dependencies on the performance of the structured language model (SLM) along three dimensions: parsing accuracy (LP/LR), perplexity (PPL), and word error rate (WER, measured by N-best re-scoring). We show that our models improve on the reported baseline SLM results in LP/LR, PPL, and/or WER on the UPenn Treebank and Wall Street Journal (WSJ) corpora, respectively. Analysis of parsing performance shows a correlation between the quality of the parser (as measured by precision/recall) and language model performance (PPL and WER). Remarkably, the enriched SLM outperforms the baseline 3-gram model in WER by 10% when used in isolation as a second-pass (N-best re-scoring) language model.
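To make the second-pass (N-best re-scoring) setup concrete, the following is a minimal sketch, not the paper's implementation: the function name `rescore_nbest`, the argument `lm_logprob`, and the default values of `lm_weight` and `word_insertion_penalty` are all assumptions introduced here for illustration. Each N-best hypothesis produced by the first-pass recognizer is re-scored by combining its acoustic log-score with a scaled language-model log-probability, and the highest-scoring hypothesis is selected.

```python
import math
from typing import Callable, List, Sequence, Tuple


def rescore_nbest(
    nbest: Sequence[Tuple[List[str], float]],   # (word sequence, acoustic log-score) per hypothesis
    lm_logprob: Callable[[List[str]], float],   # sentence log-probability from the re-scoring LM
    lm_weight: float = 10.0,                    # LM scale factor; this value is an assumption
    word_insertion_penalty: float = 0.0,        # optional per-word bonus/penalty; assumed default
) -> List[str]:
    """Return the word sequence whose combined acoustic + LM score is highest."""
    best_words: List[str] = []
    best_score = -math.inf
    for words, acoustic_logscore in nbest:
        score = (
            acoustic_logscore
            + lm_weight * lm_logprob(words)
            + word_insertion_penalty * len(words)
        )
        if score > best_score:
            best_words, best_score = words, score
    return best_words
```

In the setup described in the abstract, `lm_logprob` would be the enriched SLM used in isolation; substituting the baseline 3-gram model for it gives the comparison point behind the reported 10% WER improvement.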