Connectionist language models offer many advantages over their statistical counterparts, but they also have drawbacks, such as a much higher computational cost. This paper describes a novel method to overcome this problem. A set of normalization values associated with the most frequent n-grams is pre-computed, and the model is smoothed with lower-order connectionist or statistical n-gram models. The proposed approach compares favourably with both standard connectionist language models and statistical back-off language models.
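The idea can be sketched as follows. A neural language model's softmax requires a normalization constant Z(h) summed over the whole vocabulary for every history h, which dominates evaluation cost; pre-computing Z(h) for the most frequent histories and backing off to a lower-order model otherwise avoids most of that work. The sketch below is purely illustrative, not the paper's implementation: the scoring function, vocabulary, and unigram back-off are stand-ins.

```python
import math

VOCAB = ["the", "cat", "sat", "mat"]

def score(word, history):
    # Stand-in for the network's un-normalized output s(w, h);
    # a real model would run a forward pass here.
    return 0.1 * len(word) + 0.05 * sum(len(h) for h in history)

def normalization(history):
    # Z(h) = sum over the vocabulary of exp(s(w, h)) -- the expensive part.
    return sum(math.exp(score(w, history)) for w in VOCAB)

# Pre-compute Z(h) offline for the most frequent n-gram histories.
frequent_histories = [("the",), ("cat",)]
Z_cache = {h: normalization(h) for h in frequent_histories}

def backoff_prob(word):
    # Stand-in lower-order (unigram) model used for smoothing.
    return 1.0 / len(VOCAB)

def prob(word, history):
    if history in Z_cache:
        # Fast path: normalization constant was pre-computed.
        return math.exp(score(word, history)) / Z_cache[history]
    # History not among the frequent n-grams: smooth with the
    # lower-order model instead of computing Z(h) at run time.
    return backoff_prob(word)
```

For cached histories this reproduces the exact softmax probability at lookup cost; for rare histories the model degrades gracefully to the back-off distribution rather than paying the full normalization sum.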