Generalization for production is a difficult problem that intelligent systems solve. One such problem might be "paint a portrait of George Bush in the style of Vincent van Gogh." In such tasks, a low-dimensional representation ("George Bush" and "the style of van Gogh") is expanded into a (nonunique) output of extremely high dimension, represented, for instance, by color, line, and shading. We designed a connectionist network for one such generalization-for-production task: generating letterforms in a new font given just a few exemplars from that font. During learning, the network constructed distributed internal representations of different fonts and letters, even though each training instance conflated font characteristics and letter characteristics. Separate hidden representations for "letter" and "font" were necessary for the network's success, and the integration of information from the font and letter representations had to occur at a proper, intermediate level of abstraction. The limitations of the network can be attributed, in part, to the limited training corpus and to the lack of translation and scale invariance.
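The architecture described above can be sketched as a small feedforward network: the letter identity and the font identity each feed their own hidden layer, and the two hidden codes are integrated at an intermediate layer before the high-dimensional letterform output. This is a minimal illustration, not the authors' exact network; all layer sizes, the 12x12 bitmap output, and the random (untrained) weights are assumptions for the sketch.

```python
import numpy as np

# Illustrative sizes (assumptions, not from the paper).
N_LETTERS, N_FONTS = 26, 5   # one-hot identity inputs
H_LETTER, H_FONT = 8, 4      # separate hidden representations
H_MERGE = 16                 # intermediate integration layer
OUT = 12 * 12                # letterform rendered as a 12x12 bitmap

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Random weights stand in for weights that would be learned
# by backpropagation on exemplar letterforms.
W_letter = rng.normal(0.0, 0.5, (N_LETTERS, H_LETTER))
W_font   = rng.normal(0.0, 0.5, (N_FONTS, H_FONT))
W_merge  = rng.normal(0.0, 0.5, (H_LETTER + H_FONT, H_MERGE))
W_out    = rng.normal(0.0, 0.5, (H_MERGE, OUT))

def render(letter_idx, font_idx):
    """Map (letter, font) one-hot codes to a letterform bitmap."""
    letter = np.eye(N_LETTERS)[letter_idx]
    font = np.eye(N_FONTS)[font_idx]
    h_letter = sigmoid(letter @ W_letter)  # letter-only hidden code
    h_font = sigmoid(font @ W_font)        # font-only hidden code
    # Integration of the two codes at an intermediate layer.
    merged = sigmoid(np.concatenate([h_letter, h_font]) @ W_merge)
    return sigmoid(merged @ W_out).reshape(12, 12)

bitmap = render(letter_idx=0, font_idx=2)  # e.g. 'a' in font #2
```

Generating "a known letter in a new font" then amounts to holding the letter input fixed while supplying a font code inferred from the few exemplars of that font.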