A system for large vocabulary sign search

  • Authors:
  • Haijing Wang, Alexandra Stefan, Sajjad Moradi, Vassilis Athitsos, Carol Neidle, Farhad Kamangar

  • Affiliations:
  • Computer Science and Engineering Department, University of Texas at Arlington, Arlington, Texas (Wang, Stefan, Moradi, Athitsos, Kamangar); Linguistics Program, Boston University, Boston, Massachusetts (Neidle)

  • Venue:
  • ECCV'10 Proceedings of the 11th European Conference on Trends and Topics in Computer Vision - Volume Part I
  • Year:
  • 2010


Abstract

A method is presented to help users look up the meaning of an unknown sign from American Sign Language (ASL). The user submits a video of the unknown sign as a query, and the system retrieves the most similar signs from a database of sign videos. The user then reviews the retrieved videos to identify the one displaying the sign of interest. Hands are detected semi-automatically: the system performs automatic hand detection and tracking, and the user has the option to verify and correct the detected hand locations. Features are extracted based on hand motion and hand appearance. Similarity between signs is measured by combining dynamic time warping (DTW) scores, which are based on hand motion, with a simple similarity measure based on hand appearance. In user-independent experiments, with a system vocabulary of 1,113 signs, the correct sign was included in the top 10 matches for 78% of the test queries.
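The core of the retrieval step described above is DTW on hand-motion trajectories, combined with an appearance term. A minimal sketch of that idea follows; the feature choices (2-D hand positions per frame, Euclidean frame cost) and the linear combination with the appearance distance are illustrative assumptions, not the paper's exact formulation:

```python
def dtw(query, candidate):
    """Return the DTW alignment cost between two trajectories,
    each a list of (x, y) hand positions per frame.
    (Illustrative: the paper's actual motion features may differ.)"""
    n, m = len(query), len(candidate)
    INF = float("inf")
    # cost[i][j]: best cost of aligning query[:i] with candidate[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            qx, qy = query[i - 1]
            cx, cy = candidate[j - 1]
            d = ((qx - cx) ** 2 + (qy - cy) ** 2) ** 0.5  # Euclidean frame cost
            cost[i][j] = d + min(cost[i - 1][j],      # stall on candidate frame
                                 cost[i][j - 1],      # stall on query frame
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]


def combined_score(query, candidate, appearance_distance, weight=0.5):
    """Hypothetical linear combination of the motion (DTW) score with an
    appearance distance; the weighting scheme is an assumption."""
    return dtw(query, candidate) + weight * appearance_distance
```

Ranking every database sign by `combined_score` and returning the lowest-scoring entries would yield the top-N candidate list that the user then reviews.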