Effect of Displaying Human Videos During an Evaluation Study of American Sign Language Animation
ACM Transactions on Accessible Computing (TACCESS)
Facial expressions and head movements communicate essential information during ASL sentences. We aim to improve the facial expressions in ASL animations and make them more understandable, ultimately improving the accessibility of online information for deaf people with low English literacy. This paper presents how we engineer stimuli and questions to measure whether viewers have seen and correctly understood the linguistic facial expressions. In two studies, we investigate how varying several parameters (the variety of facial expressions, the language in which the stimuli were originally composed, and the degree of involvement of a native ASL signer in the stimulus design) affects the results of a user evaluation study of facial expressions in ASL animation.