This paper presents a cognitive model for an autonomous agent based on emotional psychology and Bayesian programming. A robot with emotional responses allows behaviour to be planned differently from present robotic architectures and provides a new kind of interface for human/robot interaction. Because the emotional modules make the robot's emotional state directly accessible, it is relatively simple to render a virtual face that expresses those emotions. To interact with the real environment in which it operates, an autonomous agent needs a model of that environment; any model of a real phenomenon, however, is necessarily incomplete because uncertain and unknown variables influence the phenomenon, which motivates the Bayesian treatment. Two example architectures are proposed here, and experimental data obtained with them is provided to verify the correctness of this approach.
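To make the idea concrete, the following is a minimal sketch (not the paper's actual architecture) of how a Bayesian emotional module might work: a belief over a hidden stimulus class is updated from noisy sensor readings with Bayes' rule, and the most probable stimulus is then mapped to a virtual-face expression. All names here (`STIMULI`, `LIKELIHOOD`, `expression_for`, the reading categories) are hypothetical illustrations, not identifiers from the paper.

```python
# Hypothetical emotional module: a discrete Bayesian belief over a
# hidden stimulus class, updated from noisy readings, drives the face.

STIMULI = ["friendly", "neutral", "threatening"]

# P(reading | stimulus) for a sound sensor discretised to {"soft", "loud"}.
# These probabilities are made up for illustration.
LIKELIHOOD = {
    "friendly":    {"soft": 0.8, "loud": 0.2},
    "neutral":     {"soft": 0.5, "loud": 0.5},
    "threatening": {"soft": 0.1, "loud": 0.9},
}

def update_belief(belief, reading):
    """One step of Bayes' rule: posterior is proportional to likelihood * prior."""
    posterior = {s: LIKELIHOOD[s][reading] * belief[s] for s in STIMULI}
    total = sum(posterior.values())
    return {s: p / total for s, p in posterior.items()}

def expression_for(belief):
    """Map the most probable stimulus to a virtual-face expression."""
    best = max(belief, key=belief.get)
    return {"friendly": "smile", "neutral": "rest", "threatening": "fear"}[best]

# Start from a uniform prior and fuse two noisy readings.
belief = {s: 1.0 / len(STIMULI) for s in STIMULI}
for reading in ["loud", "loud"]:
    belief = update_belief(belief, reading)

print(expression_for(belief))  # prints "fear"
```

The point of the sketch is that the emotional state is an explicit, queryable distribution, so the interface layer can read it off directly to select a facial expression, exactly the property the abstract highlights.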