Evaluating facial expressions in American Sign Language animations for accessible online information
UAHCI'13 Proceedings of the 7th International Conference on Universal Access in Human-Computer Interaction: Design Methods, Tools, and Interaction Techniques for eInclusion - Volume Part I
Animations of American Sign Language (ASL) have accessibility benefits for signers with lower written-language literacy. Our lab has conducted prior evaluations of synthesized ASL animations, asking native signers to watch different versions of animations and answer comprehension and subjective questions about them. Seeking an alternative method of measuring users' reactions to animations, we are now investigating the use of eye tracking to understand how users perceive our stimuli. This study quantifies how the eye gaze of native signers varies when they view videos of a human ASL signer or synthesized ASL animations of differing quality levels. We found that, when viewing videos, signers spend more time looking at the face and less frequently move their gaze between the face and body of the signer. We also found correlations between these two eye-tracking metrics and participants' responses to subjective evaluations of animation quality. This paper provides methodological guidance for how to design user studies evaluating sign language animations that include eye tracking, and it suggests how certain eye-tracking metrics could be used as an alternative or complementary form of measurement in evaluation studies of sign language animation.
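The two eye-tracking metrics named above — the proportion of viewing time spent on the signer's face, and the frequency of gaze shifts between the face and the body — can be sketched from a fixation sequence labeled by area of interest (AOI). The following is a minimal illustration only: the AOI labels, the `(aoi, duration_ms)` tuple format, and the `face_metrics` function are assumptions for exposition, not the study's actual data format or analysis code.

```python
# Hypothetical sketch of the two eye-tracking metrics discussed above.
# Input format (assumed): a list of fixations, each a tuple of
# (aoi_label, duration_ms), with AOI labels such as 'face' and 'body'.

def face_metrics(fixations):
    """Return (proportion of time on the face AOI, number of AOI transitions)."""
    total = sum(dur for _, dur in fixations)
    face_time = sum(dur for aoi, dur in fixations if aoi == 'face')
    # A transition is counted whenever consecutive fixations land on
    # different AOIs (e.g., a gaze shift from face to body or back).
    transitions = sum(
        1 for prev, curr in zip(fixations, fixations[1:])
        if prev[0] != curr[0]
    )
    return face_time / total, transitions

# Example: a viewer who dwells mostly on the face, with one excursion
# to the body and back (two transitions).
prop, trans = face_metrics([('face', 800), ('body', 200), ('face', 1000)])
# prop = 1800 / 2000 = 0.9; trans = 2
```

Per-participant values of these two metrics could then be correlated with subjective rating scores, mirroring the correlation analysis the abstract describes.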