Evaluation of the CyberGlove as a whole-hand input device
ACM Transactions on Computer-Human Interaction (TOCHI)
Tessa, a system to aid communication with deaf people
Proceedings of the fifth international ACM conference on Assistive technologies
A Real-Time Continuous Gesture Recognition System for Sign Language
FG '98 Proceedings of the 3rd International Conference on Face & Gesture Recognition
An Approach Based on Phonemes to Large Vocabulary Chinese Sign Language Recognition
FGR '02 Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition
Using Multiple Sensors for Mobile Sign Language Recognition
ISWC '03 Proceedings of the 7th IEEE International Symposium on Wearable Computers
'Visual-Fidelity' Dataglove Calibration
CGI '04 Proceedings of the Computer Graphics International
Universal Access in the Information Society
Taiwan sign language (TSL) recognition based on 3D data and neural networks
Expert Systems with Applications: An International Journal
UM'05 Proceedings of the 10th international conference on User Modeling
A dynamic gesture recognition system for the Korean sign language (KSL)
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Modeling animations of American Sign Language verbs through motion-capture of native ASL signers
ACM SIGACCESS Accessibility and Computing
Accurate and Accessible Motion-Capture Glove Calibration for Sign Language Data Collection
ACM Transactions on Accessible Computing (TACCESS)
Collecting a motion-capture corpus of American Sign Language for data-driven generation research
SLPAT '10 Proceedings of the NAACL HLT 2010 Workshop on Speech and Language Processing for Assistive Technologies
Calibration games: making calibration tasks enjoyable by adding motivating game elements
Proceedings of the 24th annual ACM symposium on User interface software and technology
Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility
Collecting and evaluating the CUNY ASL corpus for research on American Sign Language animation
Computer Speech and Language
Motion-capture recordings of sign language are used in research on automatic sign language recognition and on the generation of sign language animations, which have accessibility applications for deaf users with low written-language literacy. Motion-capture gloves are used to record the wearer's handshape. Unfortunately, these gloves require a time-consuming and inexact manual calibration process each time they are worn. This paper describes the design and evaluation of a new calibration protocol for motion-capture gloves, intended to make the process more efficient and accessible for participants who are deaf and use American Sign Language (ASL). The protocol was evaluated experimentally: deaf ASL signers wore the gloves, were calibrated with both the new protocol and the calibration routine provided by the glove manufacturer, and were asked to perform sequences of ASL handshapes. A native ASL signer rated the correctness and understandability of the collected handshape data. The new protocol received significantly higher scores than the standard calibration. The protocol has been made freely available online; it includes directions for the researcher, images and videos showing how participants move their hands during the process, and directions for participants (as ASL videos and English text).
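As background for the calibration problem the abstract describes, the sketch below shows a common and simple approach used with dataglove flex sensors: the wearer holds two extreme poses (e.g. flat hand and closed fist), the per-sensor raw minimum and maximum readings are captured, and subsequent raw values are mapped linearly onto a joint-angle range. This is a minimal illustrative sketch of that generic technique, not the protocol evaluated in the paper; the function names and the 0–90 degree range are assumptions.

```python
# Hypothetical linear min-max calibration for a single glove flex sensor.
# raw_min / raw_max come from the wearer holding two reference poses
# (e.g. flat hand, closed fist); the angle range is an assumed example.

def calibrate(raw_min, raw_max, angle_min=0.0, angle_max=90.0):
    """Return a function mapping a raw sensor reading to a joint angle."""
    span = raw_max - raw_min
    def to_angle(raw):
        # Normalize into [0, 1], clamping so noisy readings outside the
        # calibration range cannot produce out-of-bounds angles.
        t = (raw - raw_min) / span
        t = min(max(t, 0.0), 1.0)
        return angle_min + t * (angle_max - angle_min)
    return to_angle

# Example: a sensor reading 120 at flat hand and 860 at a closed fist.
bend = calibrate(120, 860)
print(round(bend(490), 1))  # midpoint of the raw range -> 45.0
```

In practice, per-person calibration is needed because hand size and sensor seating change the raw min/max each time the glove is worn, which is exactly the repeated, error-prone step the paper's protocol aims to streamline.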