Large margin classification using the perceptron algorithm
COLT '98 Proceedings of the eleventh annual conference on Computational learning theory
Learning to Parse Natural Language with Maximum Entropy Models
Machine Learning - Special issue on natural language learning
A new discriminative kernel from probabilistic models
Neural Computation
Discriminative Reranking for Natural Language Parsing
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Head-driven statistical models for natural language parsing
Computational Linguistics
Building a large annotated corpus of English: the Penn Treebank
Computational Linguistics - Special issue on using large corpora: II
A maximum-entropy-inspired parser
NAACL 2000 Proceedings of the 1st North American chapter of the Association for Computational Linguistics conference
More accurate tests for the statistical significance of result differences
COLING '00 Proceedings of the 18th conference on Computational linguistics - Volume 2
Support vector machine learning for interdependent and structured output spaces
ICML '04 Proceedings of the twenty-first international conference on Machine learning
An efficient implementation of a new DOP model
EACL '03 Proceedings of the tenth conference on European chapter of the Association for Computational Linguistics - Volume 1
ACL '02 Proceedings of the 40th Annual Meeting on Association for Computational Linguistics
Inducing history representations for broad coverage statistical parsing
NAACL '03 Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology - Volume 1
An SVM based voting algorithm with application to parse reranking
CONLL '03 Proceedings of the seventh conference on Natural language learning at HLT-NAACL 2003 - Volume 4
Using LTAG based features in parse reranking
EMNLP '03 Proceedings of the 2003 conference on Empirical methods in natural language processing
Discriminative training of a neural network statistical parser
ACL '04 Proceedings of the 42nd Annual Meeting on Association for Computational Linguistics
Incremental parsing with the perceptron algorithm
ACL '04 Proceedings of the 42nd Annual Meeting on Association for Computational Linguistics
Flexible margin selection for reranking with full pairwise samples
IJCNLP'04 Proceedings of the First international joint conference on Natural Language Processing
Japanese dependency parsing using co-occurrence information and a combination of case elements
ACL-44 Proceedings of the 21st International Conference on Computational Linguistics and the 44th annual meeting of the Association for Computational Linguistics
Searching for Part of Speech Tags That Improve Parsing Models
GoTAL '08 Proceedings of the 6th international conference on Advances in Natural Language Processing
Porting statistical parsers with data-defined kernels
CoNLL-X '06 Proceedings of the Tenth Conference on Computational Natural Language Learning
Loss minimization in parse reranking
EMNLP '06 Proceedings of the 2006 Conference on Empirical Methods in Natural Language Processing
A latent variable model for generative dependency parsing
IWPT '07 Proceedings of the 10th International Conference on Parsing Technologies
Re-ranking algorithms for name tagging
CHSLP '06 Proceedings of the Workshop on Computationally Hard Problems and Joint Inference in Speech and Language Processing
Learning with annotation noise
ACL '09 Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP: Volume 1
Adding smarter systems instead of human annotators: re-ranking for system combination
Proceedings of the 1st international workshop on Search and mining entity-relationship data
Previous research applying kernel methods to natural language parsing has focused on proposing kernels over parse trees, which are hand-crafted based on domain knowledge and computational considerations. In this paper we propose a method for defining kernels in terms of a probabilistic model of parsing. This model is then trained, so that the parameters of the probabilistic model reflect the generalizations in the training data. The method then uses these trained parameters to define a kernel for reranking parse trees. In our experiments, we use a neural network based statistical parser as the probabilistic model, and use the resulting kernel with the Voted Perceptron algorithm to rerank the top 20 parses produced by that model. This method achieves a significant improvement over the accuracy of the probabilistic model alone.
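The reranking step the abstract describes can be sketched as a perceptron-style reranker over candidate parses. This is a minimal illustration, not the paper's implementation: weight averaging stands in for the full voting scheme of the Voted Perceptron, and the feature vectors are synthetic placeholders for the model-derived features that the proposed kernel would supply.

```python
import numpy as np

def perceptron_rerank_train(candidates, gold_idx, n_epochs=5):
    """Train an averaged-perceptron reranker (a common approximation
    to the Voted Perceptron).

    candidates: list of (k, d) arrays, one per sentence -- feature
        vectors for the top-k candidate parses. In the paper these
        features would be derived from the trained probabilistic
        model's parameters; here they are arbitrary placeholders.
    gold_idx: for each sentence, the index of the best candidate.
    Returns the averaged weight vector.
    """
    d = candidates[0].shape[1]
    w = np.zeros(d)
    w_sum = np.zeros(d)
    for _ in range(n_epochs):
        for feats, gold in zip(candidates, gold_idx):
            # Score every candidate parse and pick the current best.
            pred = int(np.argmax(feats @ w))
            if pred != gold:
                # Perceptron update: move toward the gold candidate,
                # away from the wrongly preferred one.
                w += feats[gold] - feats[pred]
            w_sum += w
    return w_sum / (n_epochs * len(candidates))

def rerank(feats, w):
    """Return the index of the highest-scoring candidate parse."""
    return int(np.argmax(feats @ w))
```

On a toy problem with two sentences of three candidates each, where the second candidate is marked gold, a few epochs suffice for `rerank` to select the gold candidate for both sentences.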