In this paper, we present an affordance learning system for robotic grasping. The system comprises three components: an affordance memory, synergy-based exploration, and a grasping control strategy driven by local sensor feedback. The affordance memory is modeled with a modified growing neural gas network, which allows affordances to be learned quickly from a small dataset of human grasps and object features. After offline training, the affordance memory generates online motor commands for the robot's reaching and grasping control. When grasping new objects, the system explores candidate grasp postures efficiently in a low-dimensional synergy space, because the synergies automatically avoid abnormal postures that are more likely to produce failed grasps. Experimental results demonstrate that the affordance memory generalizes to new objects and predicts the effect of a grasp (i.e., the resulting tactile patterns).
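To illustrate the idea of a low-dimensional synergy space, the sketch below extracts postural synergies from recorded hand joint angles via PCA (computed with an SVD) and projects full postures into, and back out of, that reduced space. This is a minimal illustration under common assumptions about synergy extraction, not the paper's actual implementation; the function names, dimensions, and the random data are all hypothetical.

```python
import numpy as np

def synergy_basis(postures, n_synergies=2):
    """Return the mean posture and the top principal directions ("synergies").

    `postures` is an (n_samples, n_joints) array of recorded joint angles.
    """
    mean = postures.mean(axis=0)
    centered = postures - mean
    # SVD of the centered data; rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_synergies]

def to_synergy_space(posture, mean, basis):
    """Project a full joint-angle vector onto the synergy coefficients."""
    return basis @ (posture - mean)

def from_synergy_space(coeffs, mean, basis):
    """Reconstruct a full joint-angle posture from synergy coefficients."""
    return mean + basis.T @ coeffs

# Hypothetical example: 50 random 20-DoF postures reduced to 2 synergies.
rng = np.random.default_rng(0)
postures = rng.normal(size=(50, 20))
mean, basis = synergy_basis(postures, n_synergies=2)
coeffs = to_synergy_space(postures[0], mean, basis)
recon = from_synergy_space(coeffs, mean, basis)
```

Exploring grasps by varying only the two (or three) synergy coefficients, rather than all joint angles independently, is what keeps the search space small and biased toward natural hand shapes.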