Automatic Sign Language Analysis: A Survey and the Future beyond Lexical Meaning
IEEE Transactions on Pattern Analysis and Machine Intelligence
Image and video for hearing impaired people
Journal on Image and Video Processing
Sequential Belief-Based Fusion of Manual and Non-manual Information for Recognizing Isolated Signs
Gesture-Based Human-Computer Interaction and Simulation
Head gesture recognition based on Bayesian network
IbPRIA'05: Proceedings of the Second Iberian Conference on Pattern Recognition and Image Analysis, Volume Part I
An automated system for the detection of head movements is described. The goal is to label relevant head gestures in video of American Sign Language (ASL) communication. The system uses a 3D head tracker to recover head rotation and translation parameters from monocular video. Relevant head gestures are then detected by analyzing the length and frequency of the peaks and valleys in each motion signal. Each parameter is analyzed independently, because many relevant head movements in ASL involve major changes around a single rotational axis. No explicit training of the system is necessary. Currently, the system can detect "head shakes." In an experimental evaluation, classification performance is compared against ground-truth labels obtained from ASL linguists. Initial results are promising: the system matches the linguists' labels in a significant number of cases.
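The peak-and-valley analysis described in the abstract can be illustrated with a small sketch. The function names, thresholds, and frame rate below are illustrative assumptions, not the authors' implementation: the idea is simply that a head shake shows up as rapid, alternating peaks and valleys of sufficient amplitude in the yaw (left-right rotation) signal recovered by the tracker.

```python
# Hypothetical sketch of head-shake detection via peak/valley analysis.
# All names and thresholds are assumptions for illustration only.

def find_extrema(signal):
    """Return (index, kind) pairs for local peaks and valleys in a 1-D signal."""
    extrema = []
    for i in range(1, len(signal) - 1):
        if signal[i - 1] < signal[i] > signal[i + 1]:
            extrema.append((i, "peak"))
        elif signal[i - 1] > signal[i] < signal[i + 1]:
            extrema.append((i, "valley"))
    return extrema

def is_head_shake(yaw, fps=30.0, min_amplitude=2.0, min_cycles=2, max_period=0.6):
    """Heuristically label a window of yaw angles (degrees) as a head shake.

    Counts consecutive peak/valley pairs that alternate, swing by at least
    min_amplitude degrees, and occur within max_period seconds of each other.
    """
    extrema = find_extrema(yaw)
    cycles = 0
    for (i, kind_a), (j, kind_b) in zip(extrema, extrema[1:]):
        alternating = kind_a != kind_b
        big_enough = abs(yaw[j] - yaw[i]) >= min_amplitude
        fast_enough = (j - i) / fps <= max_period
        if alternating and big_enough and fast_enough:
            cycles += 1
    return cycles >= min_cycles
```

For example, a synthetic yaw trace oscillating left-right, such as `[0, 4, 0, -4, 0, 4, 0, -4, 0]`, would be flagged, while a slow monotonic drift would not. Analyzing each rotation axis independently in this way mirrors the abstract's observation that many relevant ASL head movements are dominated by a single rotational axis.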