A cross-situational algorithm for learning a lexicon using neural modeling fields
IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
The issue of how children learn the meanings of words is fundamental to developmental psychology. Recent attempts to develop or evolve efficient communication protocols among interacting robots or virtual agents have also made this issue central to more applied research fields, such as computational linguistics and neural networks. An attractive approach to learning an object-word mapping is so-called cross-situational learning. This learning scenario is based on the intuitive notion that a learner can determine the meaning of a word by finding something in common across all observed uses of that word. Here we show how the deterministic Neural Modeling Fields (NMF) categorization mechanism can serve the learner as an efficient algorithm for inferring the correct object-word mapping. To achieve this, we first reduce the original on-line learning problem to a batch learning problem in which the inputs to the NMF mechanism are all possible object-word associations that could be inferred from the cross-situational learning scenario. Since many of those associations are incorrect, they are treated as clutter or noise and discarded automatically by a clutter-detector model included in our NMF implementation. With these two key ingredients, batch learning and clutter detection, the NMF mechanism was able to infer the correct object-word mapping perfectly.
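The core cross-situational idea described above can be made concrete with a short sketch. The following is not the authors' NMF implementation: instead of Neural Modeling Fields with a clutter-detector model, it applies the cross-situational principle directly, treating a word's referent as whatever object appears in every scene where that word is heard. The batch of all object-word co-occurrences plays the role of the NMF inputs, and the spurious associations pruned away correspond to the "clutter" that NMF would discard. The scenes, object names, and words are invented toy data.

```python
# Illustrative sketch of cross-situational lexicon learning (simplified
# stand-in for the NMF-based algorithm described in the abstract).

def learn_lexicon(scenes):
    """Infer an object-word mapping from co-occurrence alone.

    scenes: list of (objects, words) pairs; each scene pairs the set of
    visible objects with the set of words heard, without indicating
    which word labels which object.
    """
    candidates = {}  # word -> set of objects still consistent with it
    for objects, words in scenes:
        for w in words:
            if w not in candidates:
                candidates[w] = set(objects)   # first exposure: all objects possible
            else:
                candidates[w] &= set(objects)  # keep only objects seen in every use
    # A word is learned once exactly one candidate object survives;
    # the discarded candidates are the incorrect "clutter" associations.
    return {w: next(iter(objs))
            for w, objs in candidates.items() if len(objs) == 1}

# Toy world: the intended mapping is dog->"woof", cat->"meow", bird->"tweet".
scenes = [
    ({"dog", "cat"},  {"woof", "meow"}),
    ({"dog", "bird"}, {"woof", "tweet"}),
    ({"cat", "bird"}, {"meow", "tweet"}),
]
print(learn_lexicon(scenes))
```

No single scene disambiguates any word here, yet intersecting contexts across scenes recovers the full lexicon, which is exactly the kind of batch inference the abstract attributes to the NMF mechanism.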