Pattern recognition and synthesis for sign language translation system

  • Authors:
  • M. Ohki; H. Sagawa; T. Sakiyama; E. Oohira; H. Ikeda; H. Fujisawa

  • Affiliations:
  • Central Research Laboratory, Hitachi, Ltd., 1-280, Higashi-koigakubo, Kokubunji-shi, Tokyo 185, Japan (all authors)

  • Venue:
  • Assets '94 Proceedings of the first annual ACM conference on Assistive technologies
  • Year:
  • 1994

Abstract

Sign language is one means of communication for hearing-impaired people. Words and sentences in sign language are expressed mainly through hand gestures. In this report, we describe a sign language translation system that we are developing. The system translates Japanese Sign Language into Japanese and vice versa. Hand shape and position data are input using a DataGlove, and the captured hand motions are recognized and translated into Japanese sentences. In the reverse direction, Japanese text is translated into sign language rendered as 3-D computer-graphics animation of sign language gestures.
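
The abstract describes a bidirectional pipeline: glove input is recognized and mapped to Japanese text, and Japanese text is mapped to animated signing. The sketch below is only a minimal illustration of that structure; the paper gives no API, so every name here (GloveFrame, recognize_gesture, GESTURE_TO_WORD, the animation clip identifiers) is a hypothetical placeholder, and the recognition and animation steps are stubbed out.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GloveFrame:
    """One sample from the data glove: finger-joint angles plus hand position."""
    joint_angles: List[float]       # flex values for the finger joints
    position: Tuple[float, float, float]  # (x, y, z) hand position

# Hypothetical gesture lexicon: recognized gesture label -> Japanese word.
GESTURE_TO_WORD = {
    "HELLO": "こんにちは",
    "THANK_YOU": "ありがとう",
}

def recognize_gesture(frames: List[GloveFrame]) -> str:
    """Placeholder recognizer: map a sequence of glove frames to a gesture label.

    A real system would segment the motion stream and match it against stored
    gesture templates; here we simply return a fixed label when input exists.
    """
    return "HELLO" if frames else ""

def sign_to_japanese(frames: List[GloveFrame]) -> str:
    """Sign-to-Japanese direction: recognize a gesture, then look up the word."""
    label = recognize_gesture(frames)
    return GESTURE_TO_WORD.get(label, "")

def japanese_to_sign(text: str) -> List[str]:
    """Japanese-to-sign direction: map each known word to an animation clip id.

    In the described system this step drives 3-D computer-graphics animation;
    here we only return placeholder clip identifiers.
    """
    word_to_clip = {word: f"anim_{label.lower()}" for label, word in GESTURE_TO_WORD.items()}
    return [word_to_clip[w] for w in text.split() if w in word_to_clip]

if __name__ == "__main__":
    frames = [GloveFrame(joint_angles=[0.1] * 10, position=(0.0, 0.2, 0.5))]
    print(sign_to_japanese(frames))        # -> こんにちは
    print(japanese_to_sign("こんにちは"))   # -> ['anim_hello']
```

The two top-level functions mirror the two translation directions named in the abstract; everything between them (segmentation, template matching, sentence generation, animation synthesis) is where the system's actual pattern recognition and synthesis work would sit.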