Gesture-based applications range widely, from replacing the traditional mouse as a positioning device to virtual reality and communication with the deaf. This article presents a fuzzy rule-based approach to spatio-temporal hand gesture recognition. The approach employs a powerful template-selection method based on hyperrectangular composite neural networks (HRCNNs). Templates for each hand shape are represented as crisp IF-THEN rules extracted from the synaptic weights of the corresponding trained HRCNNs. Each crisp IF-THEN rule is then fuzzified with a special membership function that expresses the degree to which a pattern matches the rule's antecedent part. When an unknown gesture is to be classified, each sample of the gesture is tested against each fuzzy rule. The accumulated similarity over all samples of the input is computed for each hand gesture in the vocabulary, and the unknown gesture is assigned to the gesture yielding the highest accumulated similarity. Based on this method, a small-sized dynamic hand gesture recognition system can be implemented. Two databases consisting of 90 spatio-temporal hand gestures are used to verify its performance. Encouraging experimental results confirm the effectiveness of the proposed method.
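The classification step described above can be sketched in a few lines: each frame of an unknown gesture is scored against hyperrectangle rules for every gesture class, per-frame similarities are accumulated, and the class with the highest total wins. The sketch below is illustrative only, assuming two-dimensional feature vectors and an exponential-decay membership function; the paper's actual membership function, its sensitivity parameters, and the HRCNN training that produces the hyperrectangle bounds are not reproduced here.

```python
import numpy as np

def membership(sample, low, high, gamma=4.0):
    """Degree to which a feature vector matches one hyperrectangle rule.
    Returns 1.0 inside the box and decays exponentially with the total
    distance outside it. (The decay shape is an assumption standing in
    for the paper's tunable membership function.)"""
    below = np.clip(low - sample, 0.0, None)   # shortfall under lower bounds
    above = np.clip(sample - high, 0.0, None)  # excess over upper bounds
    return float(np.exp(-gamma * np.sum(below + above)))

def classify(gesture, rules):
    """Classify a spatio-temporal gesture (a sequence of feature vectors).
    `rules` maps each gesture label to a list of (low, high) hyperrectangles,
    which in the paper come from trained HRCNNs. Every frame is tested
    against every rule; a frame's similarity to a class is its best match
    among that class's rules, and similarities are accumulated over all
    frames. The label with the highest accumulated similarity wins."""
    scores = {}
    for label, boxes in rules.items():
        scores[label] = sum(
            max(membership(frame, lo, hi) for lo, hi in boxes)
            for frame in gesture
        )
    return max(scores, key=scores.get)

# Toy two-class vocabulary; the bounds and feature values are made up.
rules = {
    "wave":  [(np.array([0.0, 0.0]), np.array([0.4, 0.4]))],
    "point": [(np.array([0.6, 0.6]), np.array([1.0, 1.0]))],
}
gesture = [np.array([0.1, 0.2]), np.array([0.3, 0.1]), np.array([0.2, 0.35])]
print(classify(gesture, rules))  # prints "wave": every frame lies inside the wave box
```

Accumulating similarity over all frames, rather than classifying each frame independently, is what lets the method tolerate a few ambiguous or noisy samples within an otherwise clear gesture trajectory.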