Sign language spotting is the task of detecting and recognizing signs, from a set vocabulary, in a signed utterance. Sign language spotting is difficult because instances of signs vary in both motion and appearance. Moreover, signs appear within a continuous gesture stream, interspersed with transitional movements between signs in the vocabulary and nonsign patterns (which include out-of-vocabulary signs, epentheses, and other movements that do not correspond to signs). In this paper, a novel method for designing threshold models in a conditional random field (CRF) model is proposed; it applies an adaptive threshold to distinguish between signs in the vocabulary and nonsign patterns. A short-sign detector, a hand appearance-based sign verification method, and a subsign reasoning method are included to further improve sign language spotting accuracy. Experiments demonstrate that our system can spot signs from continuous data with an 87.0 percent spotting rate and can recognize signs from isolated data with a 93.5 percent recognition rate, versus 73.5 percent and 85.4 percent, respectively, for CRFs without a threshold model, short-sign detection, subsign reasoning, and hand appearance-based sign verification. Our system also achieves a 15.0 percent sign error rate (SER) on continuous data and a 6.4 percent SER on isolated data, versus 76.2 percent and 14.5 percent, respectively, for conventional CRFs.
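The core idea of the threshold model can be illustrated with a minimal sketch. Assume a CRF-style scorer has already produced a score for each in-vocabulary sign label plus a score for a dedicated nonsign label; the nonsign score then serves as an adaptive, per-segment threshold. The function name `spot_sign` and the score dictionary are hypothetical, not from the paper:

```python
# Hypothetical sketch of adaptive-threshold spotting. A CRF-style model is
# assumed to score each in-vocabulary sign label and one "nonsign" label;
# the nonsign score acts as an adaptive threshold for that segment.

def spot_sign(label_scores, nonsign_score):
    """Return the best in-vocabulary sign label if its score exceeds the
    adaptive threshold (the nonsign label's score); otherwise return None,
    i.e., the segment is rejected as a nonsign pattern."""
    best_label, best_score = max(label_scores.items(), key=lambda kv: kv[1])
    if best_score > nonsign_score:
        return best_label
    return None


# Example usage with illustrative (made-up) scores:
accepted = spot_sign({"HELLO": 0.7, "THANKS": 0.2}, nonsign_score=0.5)
rejected = spot_sign({"HELLO": 0.3, "THANKS": 0.2}, nonsign_score=0.5)
```

Because the threshold comes from the nonsign label's own score rather than a fixed constant, the rejection criterion adapts to each segment, which is what allows the method to reject transitional movements and out-of-vocabulary signs.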