Modelling data in structured domains requires establishing relations among patterns at multiple scales. When these patterns arise from sequential data, the multiscale structure also has a dynamic component that must be modelled, particularly when, as is often the case, the data is unsegmented. Probabilistic graphical models are the predominant framework for labelling unsegmented sequential data in structured domains, but their use requires a degree of a priori knowledge about the patterns and the relations among them. This paper presents a hierarchical system, based on the connectionist temporal classification (CTC) algorithm, that labels unsegmented sequential data at multiple scales using recurrent neural networks alone. Experiments on the recognition of sequences of spoken digits show that the system outperforms hidden Markov models while making fewer assumptions about the domain.
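At the core of CTC is a dynamic-programming forward pass that sums the probability of a target label sequence over all frame-level alignments, with a blank symbol allowing the network to emit "no label" at any frame. The sketch below illustrates this computation for a single sequence; the function name and interface are illustrative, and `probs` stands in for the per-frame softmax outputs of a recurrent network (this is a minimal didactic version, not the paper's implementation).

```python
import numpy as np

def ctc_label_probability(probs, labels, blank=0):
    """Probability of `labels` given per-frame class probabilities `probs`
    of shape (T, C), summed over all blank-augmented alignments."""
    # Interleave blanks with the labels: l1 l2 -> ^ l1 ^ l2 ^
    ext = [blank]
    for l in labels:
        ext += [l, blank]
    S, T = len(ext), probs.shape[0]

    alpha = np.zeros((T, S))
    # A valid path may start with a blank or with the first label.
    alpha[0, 0] = probs[0, blank]
    if S > 1:
        alpha[0, 1] = probs[0, ext[1]]

    for t in range(1, T):
        for s in range(S):
            a = alpha[t - 1, s]                 # stay on the same symbol
            if s > 0:
                a += alpha[t - 1, s - 1]        # advance by one symbol
            # Skipping a blank is allowed only between distinct labels.
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                a += alpha[t - 1, s - 2]
            alpha[t, s] = a * probs[t, ext[s]]

    # Valid paths end on the last label or on the trailing blank.
    return alpha[T - 1, S - 1] + (alpha[T - 1, S - 2] if S > 1 else 0.0)
```

For example, with two frames of uniform probabilities over {blank, 1}, three of the four length-2 paths (01, 10, 11) collapse to the label sequence [1], giving a total probability of 0.75. A practical implementation would work in log space to avoid underflow on long sequences.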