Although syntactic structure has been used in recent language modeling work, little effort has been made to use semantic analysis in language models. In this study, we propose three new language modeling techniques that use semantic analysis for spoken dialog systems: concept sequence modeling, two-level semantic-lexical modeling, and joint semantic-lexical modeling. These models combine lexical information with varying amounts of semantic information, using annotation supplied by either a shallow semantic parser or a full hierarchical parser. They also differ in how the lexical and semantic information is combined, ranging from simple interpolation to tight integration via maximum entropy modeling. We obtain improvements in recognition accuracy over word and class N-gram language models in three different task domains. Interpolation of the proposed models with class N-gram language models provides additional improvement in the air travel reservation domain. We show that as the amount of semantic information used and the tightness of integration between lexical and semantic items increase, performance when interpolating with class language models improves, indicating that the two types of models become more complementary in nature.
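To make the "simple interpolation" end of this spectrum concrete, the sketch below shows a linear interpolation of a class N-gram probability with a semantic (e.g., concept-sequence) probability. It is a minimal illustration, not the paper's implementation: the function names, the callable language-model interfaces, and the fixed interpolation weight `lam` are all assumptions introduced here for clarity.

```python
import math


def interpolate(p_class_ngram: float, p_semantic: float, lam: float = 0.5) -> float:
    """Linearly interpolate two per-word probability estimates.

    Returns lam * P_class(w | h) + (1 - lam) * P_semantic(w | h).
    The weight lam is illustrative; in practice it would be tuned on held-out data.
    """
    return lam * p_class_ngram + (1.0 - lam) * p_semantic


def sentence_log_prob(words, class_lm, semantic_lm, lam: float = 0.5) -> float:
    """Score a word sequence under the interpolated model.

    class_lm and semantic_lm are assumed to be callables mapping
    (word, history) to a probability; history is the tuple of preceding words.
    """
    log_prob = 0.0
    history: tuple = ()
    for word in words:
        p = interpolate(class_lm(word, history), semantic_lm(word, history), lam)
        log_prob += math.log(p)
        history = history + (word,)
    return log_prob
```

The tighter maximum entropy integration mentioned above would instead combine lexical and semantic features inside a single model rather than mixing two separately trained probability estimates, which is what distinguishes the joint approach from interpolation.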