We propose a generative model based on Temporal Restricted Boltzmann Machines (TRBMs) for transition-based dependency parsing. The parse tree is built incrementally by a shift-reduce parser, and an RBM models each decision step. The RBM at the current step induces latent features, with temporal connections to the relevant previous steps supplying context information. Our parser achieves labeled and unlabeled attachment scores of 88.72% and 91.65%, respectively, which compare well with similar previous models and with the state of the art.
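To make the decision-step idea concrete, the following is a minimal sketch of one TRBM decision step: hidden units are conditioned both on the current visible features (the parser configuration) and, through directed temporal connections, on the hidden states of earlier context steps. All layer sizes, the weight names (`W`, `A`, `U`, `b`), and the softmax action read-out are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TRBMStep:
    """One decision step of a Temporal RBM (illustrative sketch).

    Hidden units receive input from the current visible features and,
    via directed temporal weights, from the hidden states of relevant
    previous steps, which act as a dynamic hidden bias.
    """

    def __init__(self, n_visible, n_hidden, n_actions, n_context):
        self.W = rng.normal(0, 0.1, (n_hidden, n_visible))            # visible-to-hidden weights
        self.A = rng.normal(0, 0.1, (n_hidden, n_hidden * n_context)) # temporal (context) weights
        self.b = np.zeros(n_hidden)                                   # static hidden bias
        self.U = rng.normal(0, 0.1, (n_actions, n_hidden))            # hidden-to-action read-out

    def hidden_probs(self, v, context_hiddens):
        # Temporal connections contribute a context-dependent bias
        # computed from the concatenated previous hidden states.
        ctx = np.concatenate(context_hiddens)
        return sigmoid(self.W @ v + self.A @ ctx + self.b)

    def action_distribution(self, v, context_hiddens):
        # Softmax over parser actions (e.g. shift / left-arc / right-arc),
        # computed from the induced latent features.
        h = self.hidden_probs(v, context_hiddens)
        scores = self.U @ h
        e = np.exp(scores - scores.max())
        return e / e.sum()
```

In a shift-reduce loop, `action_distribution` would be queried at each step with features of the stack and buffer tops, passing in the hidden states of the steps that built the relevant neighbouring subtrees as `context_hiddens`.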