Convolution kernels and recursive neural networks are both suitable approaches to supervised learning when the input is a discrete structure such as a labeled tree or graph. We compare the two techniques on two natural language problems in which the learning task is to choose the best tree from a set of candidates. We report an empirical evaluation of the two methods on a large corpus of parsed sentences and speculate on the roles played by the representation and the loss function.
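As a concrete illustration of the kernel side of this comparison, the sketch below implements a subtree convolution kernel over labeled trees in the style of Collins and Duffy. The abstract does not specify which kernel the paper uses, so the `Tree` class, the decay parameter `lam`, and the toy parse trees are all illustrative assumptions rather than the paper's actual setup.

```python
# A minimal sketch of a Collins-Duffy-style subtree convolution kernel
# over labeled trees. All names here (Tree, lam, the toy parses) are
# illustrative assumptions, not the kernel used in the paper.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Tree:
    label: str
    children: List["Tree"] = field(default_factory=list)


def production(n: Tree):
    """A node's production: its label plus the labels of its children."""
    return (n.label, tuple(c.label for c in n.children))


def common_subtrees(n1: Tree, n2: Tree, lam: float) -> float:
    """Weighted count of common subtree fragments rooted at n1 and n2,
    downweighted by lam to limit the influence of large fragments."""
    if production(n1) != production(n2):
        return 0.0
    if not n1.children:  # matching leaves
        return lam
    score = lam
    # Productions match, so both nodes have the same number of children.
    for c1, c2 in zip(n1.children, n2.children):
        score *= 1.0 + common_subtrees(c1, c2, lam)
    return score


def nodes(t: Tree):
    """Yield every node of the tree in preorder."""
    yield t
    for c in t.children:
        yield from nodes(c)


def tree_kernel(t1: Tree, t2: Tree, lam: float = 0.4) -> float:
    """K(t1, t2) = sum over all node pairs of common_subtrees."""
    return sum(common_subtrees(a, b, lam)
               for a in nodes(t1) for b in nodes(t2))


# Example: two small parse trees sharing an NP fragment.
def make_np() -> Tree:
    return Tree("NP", [Tree("D", [Tree("the")]),
                       Tree("N", [Tree("dog")])])


t1 = Tree("S", [make_np(), Tree("VP", [Tree("V", [Tree("barks")])])])
t2 = Tree("S", [make_np(), Tree("VP", [Tree("V", [Tree("runs")])])])
print(tree_kernel(t1, t2))
```

In a candidate-ranking task like the one described above, such a kernel would typically feed a large-margin learner (e.g. a perceptron or SVM) that scores each candidate tree, with the highest-scoring tree selected as the output.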