A regression model of adjective-noun compositionality in distributional semantics
GEMS '10 Proceedings of the 2010 Workshop on GEometrical Models of Natural Language Semantics
This article introduces and evaluates an approach to semantic compositionality in computational linguistics based on the combination of Distributional Semantics and supervised Machine Learning. In brief, distributional semantic spaces are built containing representations for complex constructions such as Adjective-Noun and Verb-Noun pairs, as well as for their constituent parts. These representations are then used as feature vectors in a supervised learning model based on multivariate multiple regression: the distributional representations of the constituents are used to predict those of the complex structures. This approach outperforms rival models in a series of experiments with Adjective-Noun pairs extracted from the British National Corpus (BNC). In a second experimental setting based on Verb-Noun pairs, all models perform considerably worse; however, the proposed approach gives the best results in combination with a Random Indexing semantic space.
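The core idea described above can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the vectors here are synthetic random data standing in for corpus-derived distributional representations, and the dimensionality, variable names, and evaluation by cosine similarity are illustrative assumptions.

```python
import numpy as np

# Sketch: learn a multivariate multiple regression that maps constituent
# vectors (adjective, noun) onto the observed vector of the Adjective-Noun
# compound. All data is synthetic; in the paper, these would come from a
# distributional semantic space built over a corpus.
rng = np.random.default_rng(0)
n_pairs, dim = 200, 50                  # training pairs, space dimensionality

adj = rng.normal(size=(n_pairs, dim))   # adjective vectors (synthetic)
noun = rng.normal(size=(n_pairs, dim))  # noun vectors (synthetic)

# Synthetic "observed" compound vectors: a linear mix of the constituents
# plus noise, standing in for corpus-derived vectors of the AN pairs.
true_W = rng.normal(size=(2 * dim, dim)) * 0.1
X = np.hstack([adj, noun])              # features: concatenated constituents
Y = X @ true_W + rng.normal(scale=0.01, size=(n_pairs, dim))

# Multivariate multiple regression: least-squares fit with one output
# column per target dimension, solved jointly in closed form.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Predict the compound vector for a held-out constituent pair and score
# the prediction by cosine similarity against the observed vector.
x_new = np.hstack([rng.normal(size=dim), rng.normal(size=dim)])
y_true = x_new @ true_W
y_pred = x_new @ W
cos = float(y_pred @ y_true /
            (np.linalg.norm(y_pred) * np.linalg.norm(y_true)))
print(round(cos, 3))
```

Concatenating the two constituent vectors is only one possible feature encoding; the learned weight matrix then plays the role of a composition function evaluated against the observed compound vectors.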