SmartGlove for upper extremities rehabilitative gaming assessment

  • Authors and affiliations:
  • Ming-Chun Huang (University of California, Los Angeles, California)
  • Wenyao Xu (University of California, Los Angeles, California)
  • Yi Su (University of California, Los Angeles Extension, California)
  • Belinda Lange (University of Southern California, Los Angeles, California)
  • Chien-Yen Chang (University of Southern California, Los Angeles, California)
  • Majid Sarrafzadeh (University of California, Los Angeles, California)

  • Venue:
  • Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments
  • Year:
  • 2012

Abstract

This paper presents a quantitative assessment solution for an upper extremity rehabilitative gaming application [1]. The solution consists of stand-alone hardware: SmartGlove and the Kinect, a depth-sensing camera made by Microsoft. SmartGlove is a purpose-built motion and finger-angle extraction device, packaged in an easy-to-wear, adjustable form for patients with upper extremity impairments. Sensor data extraction, alignment, and visualization algorithms integrate the hand-mounted sensor data streams with the skeleton coordinates captured by the Kinect. The resulting enhanced skeleton information can be summarized and replayed as upper extremity joint-coordinate animations that physical therapists can use to quantify rehabilitation progress. In addition, as an assessment tool, the enhanced skeleton information extends the capability of the Kinect vision system, for example by providing motion capture of the upper extremities even when the subject is outside the camera's field of view or the upper extremities are occluded by the body.
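
The abstract describes aligning hand-mounted sensor data streams with Kinect skeleton coordinates before replay. The paper does not give implementation details, so the following is only a minimal sketch, assuming nearest-timestamp matching between the two streams; the names GloveSample, SkeletonFrame, and align_streams are hypothetical and not from the paper.

```python
# Hypothetical sketch: attach the glove sample nearest in time to each Kinect
# skeleton frame, producing an "enhanced skeleton" record that could later be
# replayed as a joint-coordinate animation. Not the authors' implementation.
from bisect import bisect_left
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class GloveSample:
    t: float                     # timestamp in seconds
    finger_angles: List[float]   # e.g., one flexion angle per finger


@dataclass
class SkeletonFrame:
    t: float                             # timestamp in seconds
    joints: Dict[str, Tuple[float, float, float]]  # joint name -> (x, y, z)


def align_streams(glove: List[GloveSample],
                  skeleton: List[SkeletonFrame]
                  ) -> List[Tuple[SkeletonFrame, GloveSample]]:
    """Pair each skeleton frame with the glove sample closest in time.

    Both input lists are assumed to be sorted by timestamp.
    """
    if not glove:
        return []
    glove_times = [g.t for g in glove]
    aligned = []
    for frame in skeleton:
        i = bisect_left(glove_times, frame.t)
        # Consider the neighboring samples on either side of the insertion
        # point and keep whichever is nearer to the frame's timestamp.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(glove)]
        best = min(candidates, key=lambda j: abs(glove[j].t - frame.t))
        aligned.append((frame, glove[best]))
    return aligned
```

Each aligned pair could then be serialized and replayed as an upper extremity joint-coordinate animation for therapist review, in the spirit of the workflow the abstract outlines.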