Gesture-based programming is a paradigm for programming robots by human demonstration, in which the demonstrator directs the self-adaptation of executable software. The goal is to provide a more natural environment for the user as programmer and to generate more complete and successful programs by focusing on task experts rather than programming experts. We call the paradigm "gesture-based" because the system aims to capture, in real time, the intention behind the demonstrator's fleeting, context-dependent hand motions, contact conditions, finger poses, and even cryptic utterances, and to reconfigure itself accordingly. The system is self-adaptive in the sense that knowledge of previously acquired skills (sensorimotor expertise) is retained; this knowledge facilitates the interpretation of gestures during training and then provides feedback control at runtime.
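The training/runtime split described above can be sketched in code. This is a minimal, hypothetical illustration only: the class and method names (`Skill`, `GestureProgrammer`, `observe`, `run`) and the feature-matching scheme are assumptions for exposition, not the paper's actual architecture. The sketch shows the core idea that a library of previously acquired skills interprets observed gesture features during demonstration, composing a program that is later executed.

```python
from dataclasses import dataclass, field

# Hypothetical sketch -- names and matching logic are illustrative,
# not the system described in the paper.

@dataclass
class Skill:
    """A previously acquired sensorimotor skill used to interpret gestures."""
    name: str
    cues: frozenset  # gesture features this skill recognizes

    def matches(self, features):
        # A skill "claims" a gesture if any of its cues are observed.
        return bool(self.cues & features)

@dataclass
class GestureProgrammer:
    skills: list = field(default_factory=list)
    program: list = field(default_factory=list)  # skills selected during demo

    def observe(self, gesture_features):
        """Training phase: map an observed gesture (hand motion, contact
        condition, finger pose, utterance) to a known skill and append it
        to the evolving program -- the self-adaptation step."""
        for skill in self.skills:
            if skill.matches(gesture_features):
                self.program.append(skill.name)
                return skill.name
        return None  # unrecognized gesture; no skill claims it

    def run(self):
        """Runtime phase: the composed skill sequence drives execution,
        with each skill providing its own feedback control."""
        return list(self.program)

# Usage: two demonstrated gestures map onto two stored skills.
gp = GestureProgrammer(skills=[
    Skill("guarded_move", frozenset({"approach", "contact"})),
    Skill("precision_grasp", frozenset({"pinch", "finger_pose"})),
])
gp.observe({"approach", "contact"})
gp.observe({"pinch"})
print(gp.run())  # ['guarded_move', 'precision_grasp']
```

In a real system the matching step would be a far richer inference over sensor streams; the point of the sketch is only the division of labor between stored skill knowledge (interpretation during training) and the composed program (control at runtime).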