In this paper, we introduce Linear Relational Embedding as a means of learning a distributed representation of concepts from data consisting of binary relations between these concepts. The key idea is to represent concepts as vectors, binary relations as matrices, and the operation of applying a relation to a concept as a matrix-vector multiplication that produces an approximation to the related concept. A representation for concepts and relations is learned by maximizing an appropriate discriminative goodness function using gradient ascent. On a task involving family relationships, learning is fast and leads to good generalization.
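The scheme described above can be sketched in a few lines of NumPy. This is a toy illustration, not the paper's implementation: the concept vectors, the single "successor"-style relation, the softmax-over-distances goodness function, and the use of numerical gradients are all assumptions made here for brevity. Applying a relation is a matrix-vector product `R @ C[a]`, and the discriminative goodness is the log-probability that the prediction lands nearer the correct concept than any other.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical data): 4 concepts, 1 relation, triples (a, R, b)
n_concepts, dim = 4, 3
triples = [(0, 0, 1), (1, 0, 2), (2, 0, 3)]  # a chain of related concepts

# Parameters: concept vectors C and one relation matrix R,
# stored in a single flat vector so they are easy to perturb.
def unpack(theta):
    C = theta[:n_concepts * dim].reshape(n_concepts, dim)
    R = theta[n_concepts * dim:].reshape(dim, dim)
    return C, R

def goodness(theta):
    # Discriminative goodness: log-probability that R @ C[a] is closer
    # to C[b] than to any other concept (softmax over -squared distances).
    C, R = unpack(theta)
    total = 0.0
    for a, _, b in triples:
        pred = R @ C[a]
        d2 = np.sum((C - pred) ** 2, axis=1)
        total += -d2[b] - np.log(np.sum(np.exp(-d2)))
    return total

# Gradient ascent; central-difference gradients are fine at this scale.
theta = rng.normal(scale=0.1, size=n_concepts * dim + dim * dim)
eps, lr = 1e-5, 0.3
for step in range(500):
    grad = np.empty_like(theta)
    for i in range(theta.size):
        t_plus, t_minus = theta.copy(), theta.copy()
        t_plus[i] += eps
        t_minus[i] -= eps
        grad[i] = (goodness(t_plus) - goodness(t_minus)) / (2 * eps)
    theta += lr * grad

# Check that applying R to each concept retrieves the related concept.
C, R = unpack(theta)
for a, _, b in triples:
    pred = R @ C[a]
    nearest = int(np.argmin(np.sum((C - pred) ** 2, axis=1)))
    print(f"concept {a} -> nearest {nearest} (target {b})")
```

A real implementation would use analytic gradients (or an autodiff framework) and the paper's full family-tree data; the point here is only that a single linear map, trained by ascending a discriminative goodness, can make matrix-vector products land on the correct concept vectors.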