Evaluating facial expressions in American Sign Language animations for accessible online information

  • Authors:
  • Hernisa Kacorri, Pengfei Lu, Matt Huenerfauth

  • Affiliations:
  • Hernisa Kacorri, Pengfei Lu: Doctoral Program in Computer Science, The Graduate Center, The City University of New York (CUNY), New York, NY
  • Matt Huenerfauth: Computer Science Department, CUNY Queens College; Computer Science and Linguistics Programs, CUNY Graduate Center, The City University of New York (CUNY), Flushing, NY

  • Venue:
  • UAHCI'13: Proceedings of the 7th International Conference on Universal Access in Human-Computer Interaction: Design Methods, Tools, and Interaction Techniques for eInclusion - Volume Part I
  • Year:
  • 2013

Abstract

Facial expressions and head movements communicate essential information during ASL sentences. We aim to improve the facial expressions in ASL animations and make them more understandable, ultimately leading to better accessibility of online information for deaf people with low English literacy. This paper presents how we engineer stimuli and questions to measure whether a viewer has seen and correctly understood the linguistic facial expressions. In two studies, we investigate how varying several parameters (the variety of facial expressions, the language in which the stimuli were invented, and the degree of involvement of a native ASL signer in the stimuli design) affects the results of a user evaluation study of facial expressions in ASL animations.