Application of Cascade Correlation Networks for Structures to Chemistry
Applied Intelligence
IEEE Transactions on Knowledge and Data Engineering
Integrating Linguistic Primitives in Learning Context-Dependent Representation
IEEE Transactions on Knowledge and Data Engineering
Symbolic vs. Connectionist Learning: An Experimental Comparison in a Structured Domain
IEEE Transactions on Knowledge and Data Engineering
Clustering and Classification in Structured Data Domains Using Fuzzy Lattice Neurocomputing (FLN)
IEEE Transactions on Knowledge and Data Engineering
Representation and Processing of Structures with Binary Sparse Distributed Codes
IEEE Transactions on Knowledge and Data Engineering
Representation and extrapolation in multilayer perceptrons
Neural Computation
Images, Frames, and Connectionist Hierarchies
Neural Computation
Component-based visual clustering using the self-organizing map
Neural Networks
Language models based on semantic composition
EMNLP '09 Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing: Volume 1 - Volume 1
Syntactic and semantic factors in processing difficulty: an integrated measure
ACL '10 Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics
Compositional matrix-space models of language
ACL '10 Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics
Distributed representations to detect higher order term correlations in textual content
RSCTC'10 Proceedings of the 7th international conference on Rough sets and current trends in computing
Compositional expectation: a purely distributional model of compositional semantics
IWCS '11 Proceedings of the Ninth International Conference on Computational Semantics
Introducing scalable quantum approaches in language representation
QI'11 Proceedings of the 5th international conference on Quantum interaction
Enhancing clinical concept extraction with distributional semantics
Journal of Biomedical Informatics
Domain and function: a dual-space model of semantic relations and compositions
Journal of Artificial Intelligence Research
A comparison of vector-based representations for semantic composition
EMNLP-CoNLL '12 Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning
Semantic compositionality through recursive matrix-vector spaces
EMNLP-CoNLL '12 Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning
A dynamic binding mechanism for retrieving and unifying complex predicate-logic knowledge
ICANN'12 Proceedings of the 22nd international conference on Artificial Neural Networks and Machine Learning - Volume Part I
Associative memories are conventionally used to represent data with very simple structure: sets of pairs of vectors. This paper describes a method for representing more complex compositional structure in distributed representations. The method uses circular convolution to associate items, which are represented by vectors. Arbitrary variable bindings, short sequences of various lengths, simple frame-like structures, and reduced representations can be represented in a fixed-width vector. These representations are items in their own right and can be used in constructing compositional structures. The noisy reconstructions extracted from convolution memories can be cleaned up by using a separate associative memory that has good reconstructive properties.
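The binding scheme the abstract describes can be sketched in a few lines of NumPy: circular convolution binds a role vector to a filler vector, circular correlation approximately unbinds it, and a separate item memory cleans up the noisy result. This is a minimal illustration under assumed parameters (the function names `cconv`/`ccorr`, vector width, and toy role/filler vectors are illustrative, not from the paper):

```python
import numpy as np

def cconv(a, b):
    # Circular convolution via FFT: binds two n-vectors into one n-vector.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def ccorr(a, b):
    # Circular correlation: approximate inverse of cconv (unbinding).
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

rng = np.random.default_rng(0)
n = 1024
# Elements drawn with variance 1/n so vectors have expected unit length.
role = rng.normal(0.0, 1.0 / np.sqrt(n), n)
filler = rng.normal(0.0, 1.0 / np.sqrt(n), n)

trace = cconv(role, filler)   # bind: same width as the inputs
noisy = ccorr(role, trace)    # unbind: a noisy reconstruction of filler

# Clean-up memory: compare the noisy vector against stored items
# and keep the closest match.
distractor = rng.normal(0.0, 1.0 / np.sqrt(n), n)
items = np.stack([filler, distractor])
best = int(np.argmax(items @ noisy))  # index 0 recovers filler
```

Because the bound trace is itself a fixed-width vector, it can serve as a role or filler in a further `cconv`, which is what makes the representation compositional.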