Single-word vector space models have been very successful at learning lexical information. However, they cannot capture the compositional meaning of longer phrases, which prevents them from achieving a deeper understanding of language. We introduce a recursive neural network (RNN) model that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length. Our model assigns a vector and a matrix to every node in a parse tree: the vector captures the inherent meaning of the constituent, while the matrix captures how it changes the meaning of neighboring words or phrases. This matrix-vector RNN can learn the meaning of operators in propositional logic and natural language. The model obtains state-of-the-art performance on three different experiments: predicting fine-grained sentiment distributions of adverb-adjective pairs; classifying sentiment labels of movie reviews; and classifying semantic relationships, such as cause-effect or topic-message, between nouns using the syntactic path between them.
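As a minimal sketch of the composition step described above, the following NumPy code pairs each constituent with a meaning vector and an operator matrix, lets each child's matrix transform its sibling's vector before a learned nonlinear combination produces the parent vector, and builds the parent matrix as a learned linear map of the children's matrices. The parameter names (W, W_M), the tanh nonlinearity, and the random initialization are illustrative assumptions consistent with the abstract, not the authors' exact formulation.

```python
import numpy as np

n = 4                      # embedding dimension (illustrative)
rng = np.random.default_rng(0)

# Assumed global parameters, randomly initialized for illustration:
# W combines the two transformed child vectors into the parent vector;
# W_M combines the two child matrices into the parent matrix.
W = rng.standard_normal((n, 2 * n)) * 0.1
W_M = rng.standard_normal((n, 2 * n)) * 0.1

def compose(a, A, b, B):
    """Compose constituents (a, A) and (b, B) into a parent (p, P).

    Each constituent carries a meaning vector (a, b) and an operator
    matrix (A, B) that modifies the meaning of its neighbor.
    """
    # Each child's matrix first transforms its sibling's vector ...
    left, right = B @ a, A @ b
    # ... then a nonlinearity over a learned linear map yields the parent vector.
    p = np.tanh(W @ np.concatenate([left, right]))
    # The parent's operator matrix is a linear function of the children's matrices.
    P = W_M @ np.concatenate([A, B], axis=0)
    return p, P

# Hypothetical example: compose an operator-like adverb ("very") with "good".
very_v, very_M = rng.standard_normal(n), np.eye(n) + 0.5 * rng.standard_normal((n, n))
good_v, good_M = rng.standard_normal(n), np.eye(n)
phrase_v, phrase_M = compose(very_v, very_M, good_v, good_M)
print(phrase_v.shape, phrase_M.shape)  # (4,) (4, 4)
```

Applied bottom-up over a parse tree, this step gives every internal node its own vector-matrix pair, which is how operator-like words (negations, intensifying adverbs) can reshape the meaning of the phrases they attach to.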