Realistic and robust head-eye coordination of conversational robot actors in human tracking applications

  • Authors:
  • Jartuwat Rajruangrabin; Dan O. Popa

  • Affiliations:
  • University of Texas at Arlington, Fort Worth, TX; University of Texas at Arlington, Fort Worth, TX

  • Venue:
  • Proceedings of the 2nd International Conference on PErvasive Technologies Related to Assistive Environments
  • Year:
  • 2009

Abstract

Recent advances in computing and robot technology create new opportunities for building robots with increasingly sophisticated interactivity. One such application is visual interaction between humans and humanoids in tasks such as mimicking and following. Achieving realistic head-eye motion in a humanoid requires an understanding of the human kinesiology that dictates how humans coordinate head and eye motion, as well as the ability to control the humanoid so that it moves in the same manner humans do. In this paper we propose an efficient head-eye motion coordination scheme based on an optimization approach: an objective function is formed from human kinesiology and then optimized to obtain a realistic head-eye trajectory. Tracking robustness during conversational interaction with a human is further enhanced through a visual feedback scheme that reduces modeling errors of the humanoid hardware. Experimental results demonstrate the tracking efficiency and realism of the motion generated by the proposed scheme on Lilly, a humanoid under development in our lab.
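To make the flavor of the optimization approach concrete, the following is a minimal sketch, not the paper's actual objective function: a desired gaze shift is split between head and eye rotations by minimizing a quadratic effort cost, loosely reflecting the human tendency to let the faster, cheaper eye movements carry a larger share of a gaze shift. The weights `w_head` and `w_eye` are illustrative placeholders, not values from the paper.

```python
def split_gaze(g, w_head=2.0, w_eye=1.0):
    """Split a desired gaze angle g (degrees) into head and eye angles.

    Minimizes  w_head*head**2 + w_eye*eye**2  subject to  head + eye == g.
    With a quadratic cost and a linear constraint, the minimizer has the
    closed-form solution below (via a Lagrange multiplier): each joint's
    share is inversely proportional to its weight.
    """
    head = g * w_eye / (w_head + w_eye)
    eye = g - head
    return head, eye

head, eye = split_gaze(30.0)
print(head, eye)  # 10.0 20.0 -- the cheaper eye motion takes the larger share
```

In the paper's setting the objective is built from human kinesiology data and optimized over a full trajectory rather than a single static split, but the same principle, trading off head versus eye contributions through a cost function, applies.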