Natural methods for robot task learning: instructive demonstrations, generalization and practice
AAMAS '03 Proceedings of the second international joint conference on Autonomous agents and multiagent systems
The current research provides results from three experiments on the ability of a mobile robot to acquire new behaviors by integrating guidance from a human user with its own internal representation of the resulting perceptual and motor events. The robot learns to associate perceptual state changes with the conditional initiation and cessation of primitive motor behaviors. After several training trials, the system learns to ignore irrelevant perceptual factors, resulting in a robust representation of complex behaviors that require conditional execution based on dynamically changing perceptual states. Three experiments demonstrate the robustness of this approach in learning composite perceptual-motor behavioral sequences of varying complexity.
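The core idea described in the abstract, associating perceptual state changes with behavior initiation and discarding perceptual factors that vary across training trials, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the data layout (behavior names mapped to feature/value dicts) and the intersection-based pruning rule are assumptions.

```python
# Hypothetical sketch (not the paper's actual system): learn which perceptual
# conditions trigger a primitive behavior, pruning factors that are
# inconsistent across repeated training trials.

def learn_trigger_conditions(trials):
    """Each trial maps a behavior name to the perceptual state
    (a dict of feature -> value) observed when that behavior was initiated.
    Only feature/value pairs consistent across all trials are retained."""
    learned = {}
    for trial in trials:
        for behavior, percept in trial.items():
            if behavior not in learned:
                learned[behavior] = dict(percept)
            else:
                # Drop any feature whose value differs between trials:
                # it is treated as irrelevant to triggering the behavior.
                learned[behavior] = {
                    f: v for f, v in learned[behavior].items()
                    if percept.get(f) == v
                }
    return learned

# Example: the light level varies across demonstrations and is discarded;
# the obstacle reading is consistent and is kept as the trigger condition.
trials = [
    {"turn_left": {"obstacle_front": True, "light": "bright"}},
    {"turn_left": {"obstacle_front": True, "light": "dim"}},
]
print(learn_trigger_conditions(trials))
# -> {'turn_left': {'obstacle_front': True}}
```

After several trials, only the perceptual conditions shared by every demonstration survive, mirroring the abstract's claim that the system learns to ignore irrelevant perceptual factors.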