Collecting and evaluating the CUNY ASL corpus for research on American Sign Language animation

  • Authors:
  • Pengfei Lu; Matt Huenerfauth


  • Venue:
  • Computer Speech and Language
  • Year:
  • 2014

Abstract

While sign language animation generation software has great potential to improve the accessibility of information for deaf individuals with low written-language literacy, the understandability of current sign language animation systems is limited. Data-driven methodologies using annotated sign language corpora that encode detailed human movement have enabled some researchers to address several key linguistic challenges in ASL generation. This article motivates and describes our ongoing research on collecting a motion-capture corpus of American Sign Language (ASL). To evaluate our motion-capture configuration, calibration, and recording protocol, we conducted several rounds of evaluation studies with native ASL signers. We have also used the collected data to synthesize novel ASL animations, which were likewise evaluated in experimental studies with native signers.