An Assistive Body Sensor Network Glove for Speech- and Hearing-Impaired Disabilities. In BSN '11: Proceedings of the 2011 International Conference on Body Sensor Networks.
Proceedings of the 2011 International Conference on Virtual and Mixed Reality: Systems and Applications - Volume Part II.
Robust hand gesture recognition based on finger-earth mover's distance with a commodity depth camera. In MM '11: Proceedings of the 19th ACM International Conference on Multimedia.
Gaming for upper extremities rehabilitation. In Proceedings of the 2nd Conference on Wireless Health.
Inconspicuous on-bed respiratory rate monitoring. In Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments.
mCOPD: mobile phone based lung function diagnosis and exercise system for COPD. In Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments.
See UV on your skin: an ultraviolet sensing and visualization system. In BodyNets '13: Proceedings of the 8th International Conference on Body Area Networks.
This paper presents a quantitative assessment solution for an upper-extremity rehabilitative gaming application [1]. The solution combines two pieces of stand-alone hardware: SmartGlove and the Kinect, a depth-sensing camera made by Microsoft. SmartGlove is a purpose-built motion and finger-angle extraction device, packaged in an easy-to-wear, adjustable form for patients with upper-extremity impairments. Sensor data extraction, alignment, and visualization algorithms were designed to integrate the hand-mounted sensors' data streams into the skeleton coordinates captured by the Kinect. The enhanced skeleton information can be summarized and replayed as animations of upper-extremity joint coordinates, which physical therapists can use to quantify rehabilitation progress. Beyond serving as an assessment tool, the enhanced skeleton information extends the capability of the Kinect vision system, for example by providing motion capture of the upper extremities even when the subject is outside the camera's field of view or when the upper extremities are occluded by the body.
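The abstract describes aligning hand-mounted sensor streams with Kinect skeleton frames so that finger-angle data can ride along with the skeleton for replay. The paper does not give the alignment algorithm; the sketch below is one plausible approach, assuming timestamped glove samples and skeleton frames, where each frame is paired with the glove sample nearest in time (within a tolerance). All names (`SkeletonFrame`, `align_glove_to_skeleton`, `max_dt`) are illustrative, not from the paper.

```python
from dataclasses import dataclass, field
from bisect import bisect_left
from typing import Dict, List, Tuple

@dataclass
class SkeletonFrame:
    """One Kinect skeleton sample: timestamp plus joint name -> (x, y, z).

    This is a hypothetical container, not the paper's data format.
    """
    t: float
    joints: Dict[str, Tuple[float, float, float]]
    finger_angles: List[float] = field(default_factory=list)

def align_glove_to_skeleton(frames, glove_samples, max_dt=0.05):
    """Attach to each skeleton frame the glove sample nearest in time.

    glove_samples: list of (timestamp, [finger angles]) sorted by timestamp.
    Frames with no glove sample within max_dt seconds keep an empty list,
    so a downstream replay tool can still animate the bare skeleton.
    """
    times = [t for t, _ in glove_samples]
    for f in frames:
        i = bisect_left(times, f.t)
        # Candidate neighbours: the samples just before and just after f.t.
        best = None
        for j in (i - 1, i):
            if 0 <= j < len(times):
                dt = abs(times[j] - f.t)
                if best is None or dt < best[0]:
                    best = (dt, j)
        if best is not None and best[0] <= max_dt:
            f.finger_angles = list(glove_samples[best[1]][1])
    return frames
```

Nearest-timestamp matching keeps the two sensors loosely coupled: the glove and the Kinect can sample at different rates, and the tolerance guards against stale glove data being attached when the glove stream drops out.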