Recent work has addressed the problem of imitation within the framework of interactions between two agents, whether humans or robots. We develop a model that aims to improve the self-organization of a population of robots by relying on imitation. Imitations between robots are regulated by a very simple model of emotional expression. The model is tested on a simple task: the robots must explore their environment to localize the sources they need to survive. Following a biology-inspired approach, imitation is introduced within the population of autonomous agents as a bidirectional social need, in line with Maslow's pyramid of needs [1]. In our model, imitation is integrated into a global architecture based on artificial neural networks. Running our simple and scalable model yields a significant increase in the population's survival rate and a decrease in the average number of movements each agent must make.
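The dynamics described above can be illustrated with a minimal simulation sketch. This is not the authors' neural-network architecture; it is a hypothetical toy model assuming a grid world, energy-bearing sources, and a single-bit "emotional expression" (a `happy` flag raised when an agent reaches a source) that nearby agents may imitate by moving toward the expressing agent. All names (`Agent`, `run`, grid and energy constants) are illustrative assumptions.

```python
import random

# Hypothetical world parameters (not taken from the paper).
GRID = 20          # side length of the square grid
N_AGENTS = 15      # population size
N_SOURCES = 4      # number of survival sources
ENERGY_MAX = 100   # energy restored when a source is reached
STEPS = 500        # simulation length

class Agent:
    def __init__(self):
        self.x = random.randrange(GRID)
        self.y = random.randrange(GRID)
        self.energy = ENERGY_MAX
        self.happy = False  # simplified emotional expression: raised near a source

    def step(self, sources, agents, imitate=True):
        if self.energy <= 0:
            return  # dead agents no longer move
        target = None
        if imitate:
            # Imitation: head toward the nearest living agent expressing "happiness".
            happy = [a for a in agents
                     if a is not self and a.happy and a.energy > 0]
            if happy:
                target = min(happy,
                             key=lambda a: abs(a.x - self.x) + abs(a.y - self.y))
        if target:
            # One step toward the imitated agent (sign of the coordinate difference).
            self.x += (target.x > self.x) - (target.x < self.x)
            self.y += (target.y > self.y) - (target.y < self.y)
        else:
            # Otherwise, random exploration within the grid bounds.
            self.x = max(0, min(GRID - 1, self.x + random.choice([-1, 0, 1])))
            self.y = max(0, min(GRID - 1, self.y + random.choice([-1, 0, 1])))
        self.energy -= 1
        self.happy = False
        for sx, sy in sources:
            if abs(sx - self.x) + abs(sy - self.y) <= 1:
                self.energy = ENERGY_MAX
                self.happy = True  # express emotion: others may now imitate

def run(imitate):
    """Run one simulation and return the number of surviving agents."""
    sources = [(random.randrange(GRID), random.randrange(GRID))
               for _ in range(N_SOURCES)]
    agents = [Agent() for _ in range(N_AGENTS)]
    for _ in range(STEPS):
        for a in agents:
            a.step(sources, agents, imitate)
    return sum(a.energy > 0 for a in agents)

if __name__ == "__main__":
    random.seed(0)
    print("survivors with imitation:   ", run(True))
    print("survivors without imitation:", run(False))
```

Comparing the two `run` calls (averaged over many seeds) is one way to probe the abstract's claim that emotionally regulated imitation raises the survival rate relative to purely individual exploration; a single seed gives only anecdotal evidence.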