This paper introduces a novel context-sensitive feature extraction approach for spontaneous speech recognition. As bidirectional Long Short-Term Memory (BLSTM) networks are known to improve phoneme recognition accuracy by incorporating long-range contextual information into speech decoding, we integrate the BLSTM principle into a Tandem front-end for probabilistic feature extraction. Unlike previously proposed approaches, which exploit BLSTM modeling by generating a discrete phoneme prediction feature, our feature extractor merges continuous high-level probabilistic BLSTM features with low-level features. Evaluations on challenging spontaneous, conversational speech recognition tasks show that this concept outperforms recently published architectures for feature-level context modeling.
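As a rough illustration of the Tandem merging idea described in the abstract, the sketch below log-compresses frame-level phoneme posteriors (standing in for continuous BLSTM outputs), decorrelates them with a PCA projection, and concatenates them with low-level acoustic features. All dimensions (39 low-level features, 41 phoneme classes, 25 retained components) are assumptions for the sketch, not values taken from the paper, and random arrays stand in for real MFCCs and network outputs.

```python
import numpy as np

# Hypothetical sizes: T frames, 39 low-level features (e.g. MFCCs + deltas),
# 41 phoneme classes -- assumed for illustration only.
rng = np.random.default_rng(0)
T, n_low, n_phones = 100, 39, 41

low_level = rng.normal(size=(T, n_low))           # stand-in for MFCC features
posteriors = rng.dirichlet(np.ones(n_phones), T)  # stand-in for BLSTM posteriors

# Tandem-style processing: log-compress the posteriors, mean-center them,
# then decorrelate with a PCA projection obtained via SVD.
log_post = np.log(posteriors + 1e-10)
log_post -= log_post.mean(axis=0)
_, _, vt = np.linalg.svd(log_post, full_matrices=False)
pca_post = log_post @ vt[:25].T                   # keep 25 components (assumption)

# Merge the continuous high-level features with the low-level features
# to form the Tandem feature vector fed to the recognizer.
tandem_features = np.concatenate([low_level, pca_post], axis=1)
print(tandem_features.shape)  # (100, 64)
```

The key point of the merge step is that the BLSTM contribution stays continuous (log posteriors after decorrelation) rather than being collapsed to a single discrete phoneme-prediction feature.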