Structured lexical similarity via convolution kernels on dependency trees
EMNLP '11 Proceedings of the Conference on Empirical Methods in Natural Language Processing
The representation of word meaning in texts is a central problem in Computational Linguistics. Geometrical models represent lexical semantic information in terms of the basic co-occurrences that words establish with each other in large-scale text collections. As recent work has shown, defining methods that express the meaning of phrases or sentences as operations on lexical representations is a complex and still largely open problem. In this paper, a perspective centered on Convolution Kernels is discussed, and the formulation of a Partial Tree Kernel that integrates syntactic information and lexical generalization is studied. The interaction of these information sources and the role of different geometrical models are investigated on the question classification task, where state-of-the-art results are achieved.
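The idea of combining syntactic structure with lexical generalization can be illustrated with a deliberately simplified sketch: a convolution-style similarity over dependency trees where two nodes match only if their syntactic labels agree, and the match is weighted by the cosine similarity of their word vectors (the geometrical model). This is not the authors' Partial Tree Kernel, which additionally scores subsequences of children; the `Node` class, the recursion, and the decay factor `lam` are illustrative assumptions.

```python
import math

class Node:
    """Toy dependency-tree node: a syntactic label, a word vector, children."""
    def __init__(self, label, vec, children=()):
        self.label = label
        self.vec = vec
        self.children = list(children)

def cosine(u, v):
    # Lexical similarity taken from a distributional (geometrical) model.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def node_sim(n1, n2):
    # Nodes match only when syntactic labels agree; the match is then
    # smoothed by the lexical similarity of the two word vectors.
    return cosine(n1.vec, n2.vec) if n1.label == n2.label else 0.0

def delta(n1, n2, lam=0.4):
    # Local match plus a decayed sum over all child pairs. The real PTK
    # instead enumerates shared child subsequences; this is a simplification.
    s = node_sim(n1, n2)
    if s == 0.0:
        return 0.0
    return s * (1.0 + lam * sum(delta(c1, c2, lam)
                                for c1 in n1.children
                                for c2 in n2.children))

def tree_similarity(t1, t2, lam=0.4):
    # Convolution over the two trees: sum delta over every pair of nodes.
    def nodes(t):
        yield t
        for c in t.children:
            yield from nodes(c)
    return sum(delta(a, b, lam) for a in nodes(t1) for b in nodes(t2))
```

On two identical one-child trees with unit-similar vectors, the score is `delta(root, root) + delta(child, child) = 1.4 + 1.0 = 2.4`; replacing a word with a less similar one lowers the score smoothly rather than zeroing it, which is the point of lexical generalization inside a structural kernel.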