Modular control for human motion analysis and classification in human-robot interaction

  • Authors:
  • Juan Alberto Rivera-Bautista; Ana Cristina Ramirez-Hernandez; Virginia A. Garcia-Vega; Antonio Marin-Hernandez

  • Affiliations:
  • Universidad Veracruzana, Xalapa, Mexico (all authors)

  • Venue:
  • Proceedings of the 5th ACM/IEEE international conference on Human-robot interaction
  • Year:
  • 2010


Abstract

Trajectories followed by humans can be interpreted as attitude gestures. Based on this interpretation, an autonomous mobile robot can decide how to initiate interaction with a given human. This work presents a modular control system that analyzes human walking trajectories in order to engage a robot in human-robot interaction. When the robot detects a human with its vision system, a visual tracking module begins to operate the Pan/Tilt/Zoom (PTZ) camera unit. The camera parameter configuration and the global robot localization are then used by another module to filter and track the human's legs in the laser range finder (LRF) data. The path followed by the human in the global reference frame is then processed by a further module, which determines the kind of attitude shown by the human. Based on this result, the robot decides whether interaction is needed and who is expected to begin it. At present, only three kinds of attitude are used: confidence, curiosity and nervousness.
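
The final stage of the pipeline described above maps an observed walking path to one of the three attitude labels. The paper does not specify the classification rule, so the sketch below is purely illustrative: it assumes simple hand-picked heuristics (approach vs. retreat, and how winding the path is) and hypothetical thresholds, not the authors' actual method.

```python
import math

def classify_attitude(path, robot_pos=(0.0, 0.0)):
    """Classify a walking trajectory as 'confidence', 'curiosity',
    or 'nervousness'.

    Illustrative heuristics only; thresholds and features are
    assumptions, not taken from the paper.

    path: list of (x, y) positions in the global frame, oldest first.
    """
    if len(path) < 3:
        return None  # too few samples to judge

    # Distance from the robot at each sample.
    dists = [math.hypot(x - robot_pos[0], y - robot_pos[1]) for x, y in path]

    # Heading of each step, and the absolute turn between consecutive steps.
    headings = [
        math.atan2(y2 - y1, x2 - x1)
        for (x1, y1), (x2, y2) in zip(path, path[1:])
    ]
    turns = [
        abs(math.atan2(math.sin(b - a), math.cos(b - a)))  # wrap to [-pi, pi]
        for a, b in zip(headings, headings[1:])
    ]
    mean_turn = sum(turns) / len(turns)

    approaching = dists[-1] < dists[0]

    if approaching and mean_turn < 0.3:
        return "confidence"   # direct, steady approach
    if approaching:
        return "curiosity"    # approaches, but on a winding path
    return "nervousness"      # keeps distance or retreats
```

A straight walk toward the robot would be labeled confidence, a zig-zag approach curiosity, and a retreating path nervousness; in the system described above, this label is what the robot would use to decide whether to wait for the human or to initiate the interaction itself.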