Human behavior recognition by a bio-monitoring mobile robot

  • Authors:
  • Myagmarbayar Nergui; Yuki Yoshida; Nevrez Imamoglu; Jose Gonzalez; Wenwei Yu

  • Affiliations:
  • Medical System Engineering Department, Graduate School of Engineering, Chiba University, Chiba, Japan (all authors)

  • Venue:
  • ICIRA'12: Proceedings of the 5th International Conference on Intelligent Robotics and Applications - Volume Part II
  • Year:
  • 2012

Abstract

Our ultimate goal is to develop autonomous mobile home healthcare robots that closely monitor and evaluate patients' motor function and their at-home training therapy process, and automatically call for medical personnel in emergency situations. The robots to be developed will bring cost-effective, safe, and easier at-home rehabilitation to most motor-function impaired patients (MIPs), and meanwhile relieve therapists of the great burden of conventional rehabilitation. To achieve this goal, we have developed the following programs and algorithms for monitoring subject activities and recognizing human behaviors: 1) control programs for a mobile robot to track and follow a human from three different viewpoints; 2) algorithms for measuring and analyzing lower limb joint angles from RGB-D images captured by a Kinect sensor mounted on the mobile robot; and 3) algorithms for recognizing gait gestures. In 2), compensation with colored markers was implemented to deal with joint trajectory errors caused by joint mix-up and dropped frames while the mobile robot tracks and follows human movement. In 3), we proposed Hidden Markov Model (HMM) based human behavior recognition using lower limb joint angles and the body angle. Experimental results showed that joint trajectories could be analyzed with high accuracy compared to a motion tracking system, and that human behaviors could be recognized from the joint trajectories.
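
To illustrate step 2), below is a minimal sketch of how a lower limb joint angle (e.g., knee flexion) can be computed from three skeleton joint positions obtained from a Kinect RGB-D sensor. The function name and the example coordinates are illustrative only; the paper's actual processing pipeline (including the colored-marker compensation) is not reproduced here.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 3D points a-b-c,
    e.g. hip-knee-ankle for the knee flexion angle."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

# Hypothetical hip, knee, ankle coordinates (meters, Kinect camera frame)
hip, knee, ankle = (0.0, 0.9, 2.0), (0.05, 0.5, 2.0), (0.05, 0.1, 2.1)
print(joint_angle(hip, knee, ankle))  # knee angle for one frame
```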
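For step 3), a common way to realize HMM-based behavior recognition from joint angle sequences is to train one HMM per behavior class and classify a new sequence by maximum log-likelihood. The sketch below assumes the hmmlearn library with Gaussian emissions; the paper does not specify this library or these hyperparameters, so treat it only as an outline of the approach.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed library; not named in the paper

def train_behavior_models(sequences_by_label, n_states=4):
    """Fit one Gaussian-emission HMM per behavior label.
    sequences_by_label: {label: list of (T_i x D) arrays of joint/body angles}"""
    models = {}
    for label, seqs in sequences_by_label.items():
        X = np.vstack(seqs)                  # concatenate all training sequences
        lengths = [len(s) for s in seqs]     # per-sequence lengths for fitting
        m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=100)
        m.fit(X, lengths)
        models[label] = m
    return models

def recognize(models, sequence):
    """Return the behavior label whose HMM scores the sequence highest."""
    return max(models, key=lambda lbl: models[lbl].score(sequence))
```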