In this paper we describe a video-based system for the acquisition and classification of hand-arm motions in German Sign Language. The motions are recorded with a single video camera through a modular framegrabber system; both data acquisition and motion classification run in real time. The signer wears a colour-coded glove and coloured markers at the elbow and shoulder. As a first image-processing step, these markers are segmented from the recorded input images. Features of the coloured regions are then computed to determine the 2D positions of hand and arm in each frame. The missing third dimension is derived from a geometric model of the human hand-arm system. The sequence of position data is converted into a motion representation, and the performed gesture is recognized by rule-based classification of this representation, yielding a recognition rate of 95%.
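The pipeline sketched in the abstract (segment coloured markers, locate them in 2D, then recover depth from a limb-length constraint) can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's implementation: it assumes simple RGB interval thresholding for marker segmentation, a centroid as the marker feature, and an orthographic projection with a known segment length for the depth step (the paper's actual geometric hand-arm model is not specified here). The function names `marker_centroid` and `depth_from_limb_length` are hypothetical.

```python
import math
import numpy as np

def marker_centroid(frame, lower, upper):
    """Return the 2D centroid (x, y) of pixels whose RGB values fall
    inside the interval [lower, upper], or None if no pixel matches.
    Stands in for the paper's colour-marker segmentation step."""
    mask = np.all((frame >= lower) & (frame <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def depth_from_limb_length(p1, p2, limb_len):
    """Estimate the depth offset between two joints from their 2D image
    positions, assuming a known segment length and orthographic projection
    (an assumed stand-in for the paper's geometric hand-arm model)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d2 = limb_len ** 2 - dx ** 2 - dy ** 2
    return math.sqrt(d2) if d2 > 0 else 0.0

# Synthetic 8x8 frame with a reddish 2x2 "marker" at rows 2-3, cols 4-5.
frame = np.zeros((8, 8, 3), dtype=np.uint8)
frame[2:4, 4:6] = (200, 30, 30)

centroid = marker_centroid(frame, lower=(150, 0, 0), upper=(255, 80, 80))
print(centroid)  # (4.5, 2.5)
```

In practice each marker colour gets its own threshold interval, and the per-frame centroids of the hand, elbow, and shoulder markers form the 2D position sequence that feeds the motion representation and the rule-based classifier.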