Integration of Vision and Inertial Sensors for 3D Arm Motion Tracking in Home-based Rehabilitation

  • Authors:
  • Yaqin Tao; Huosheng Hu; Huiyu Zhou

  • Affiliations:
  • Department of Computer Science, University of Essex, Wivenhoe Park, Colchester CO4 3SQ, U.K.

  • Venue:
  • International Journal of Robotics Research
  • Year:
  • 2007

Abstract

The integration of visual and inertial sensors for human motion tracking has recently attracted significant attention due to its robust performance and wide range of potential applications. This paper introduces a real-time hybrid solution to articulated 3D arm motion tracking for home-based rehabilitation that combines visual and inertial sensors. Data fusion is a key issue in this hybrid system, and two different data fusion methods are proposed. The first is a deterministic method based on arm structure and geometry information, which is suitable for simple rehabilitation motions. The second is a probabilistic method based on an Extended Kalman Filter (EKF), in which data from the two sensors are fused in a predict-correct manner to cope with sensor noise and model inaccuracy. Experimental results are presented and compared against the commercial marker-based systems CODA and Qualysis, showing good performance for the proposed solution.
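
The EKF-based fusion mentioned in the abstract is, at its core, a predict-correct cycle in which each sensor stream updates a shared state estimate. The sketch below is not the authors' implementation; it only illustrates that general pattern for a single arm joint, with the inertial sensor treated as an angular-rate measurement and the vision system as a joint-angle measurement. The state layout, sampling period, and all noise values are illustrative assumptions, and with these linear models the filter reduces to a standard Kalman filter, whereas the paper's EKF works with a full articulated 3D arm model.

# Minimal sketch (assumed, not the authors' implementation): predict-correct
# fusion of an inertial angular-rate reading and a vision-based joint-angle
# measurement for one arm joint. All noise parameters are illustrative.
import numpy as np

dt = 0.01                          # sample period (s), assumed
x = np.zeros(2)                    # state: [joint angle (rad), angular rate (rad/s)]
P = np.eye(2) * 0.1                # state covariance
Q = np.diag([1e-5, 1e-3])          # process noise (assumed)
F = np.array([[1.0, dt],           # constant angular-rate motion model
              [0.0, 1.0]])

def predict(x, P):
    """Propagate the state with the motion model (prediction step)."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def correct(x, P, z, H, R):
    """Fuse one measurement z with observation model H (correction step)."""
    y = z - H @ x                              # innovation
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

H_gyro = np.array([[0.0, 1.0]])    # inertial sensor observes angular rate
H_vis = np.array([[1.0, 0.0]])     # vision system observes joint angle
R_gyro = np.array([[1e-3]])        # inertial measurement noise (assumed)
R_vis = np.array([[1e-2]])         # vision measurement noise (assumed)

# Example: fuse one inertial and one visual sample per cycle.
samples = [(0.50, 0.005), (0.48, 0.010), (0.51, 0.016)]  # (rate rad/s, angle rad)
for gyro_rate, vision_angle in samples:
    x, P = predict(x, P)
    x, P = correct(x, P, np.array([gyro_rate]), H_gyro, R_gyro)
    x, P = correct(x, P, np.array([vision_angle]), H_vis, R_vis)
print("fused joint angle estimate (rad):", round(float(x[0]), 4))

In this toy form the prediction step carries the estimate between samples while each sensor's correction step weights its measurement by the relative noise levels, which is the same predict-correct division of labour the abstract attributes to the EKF-based fusion method.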