A review of log-polar imaging for visual perception in robotics
Robotics and Autonomous Systems
Levels of embodiment: linguistic analyses of factors influencing HRI
HRI '12 Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction
We describe a method for coordinating eye and neck motions in the control of a humanoid robotic head. Drawing on the characteristics of human oculomotor behavior, we formulate the target-tracking problem in a state-space control framework and show that suitable controller gains can be either tuned with optimal control techniques or learned from biomechanical data recorded from newborn subjects. The basic controller relies on eye-neck proprioceptive feedback. In biological systems, vestibular signals and target prediction compensate for external motions and allow target tracking with low lag; we show how inertial and prediction signals, whenever available, can be integrated into the basic control architecture. We demonstrate the method's ability to replicate the behavior of subjects of different ages and present results from a real-time implementation on a humanoid platform.
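To make the state-space formulation concrete, the following is a minimal sketch of eye-neck gaze tracking with proprioceptive feedback. It is not the paper's actual model: the gain values, the cross-coupling term, and the simple Euler integration are illustrative assumptions standing in for gains that would be tuned with optimal control or learned from biomechanical data.

```python
# Hypothetical eye-neck tracking sketch (assumed gains, not the authors' model).
# State: eye and neck angles; gaze = eye + neck must align with the target.
dt = 0.01                    # control period [s]
k_eye, k_neck = 8.0, 2.0     # assumed feedback gains: fast eye, slow neck
k_center = 1.5               # eye re-centring as the neck takes over the gaze shift

def step(eye, neck, target):
    """One control step driving gaze (eye + neck) toward the target angle,
    using proprioceptive feedback on both joint angles."""
    gaze_error = target - (eye + neck)
    eye_rate = k_eye * gaze_error - k_center * eye  # quick saccade-like response
    neck_rate = k_neck * gaze_error                 # slower neck contribution
    return eye + dt * eye_rate, neck + dt * neck_rate

eye, neck = 0.0, 0.0
for _ in range(2000):        # 20 s of simulated tracking of a fixed target
    eye, neck = step(eye, neck, target=0.5)
print(eye + neck)            # gaze settles near the 0.5 rad target, eye near zero
```

The re-centring term mimics the human pattern in which the eye initially captures the target and then counter-rotates while the neck completes the movement; vestibular or predictive signals would enter as additional feedforward terms in `step`.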