Computer-Assisted Lip Reading Recognition for Hearing Impaired

  • Authors:
  • Yun-Long Lay; Hui-Jen Yang; Chern-Sheng Lin

  • Affiliations:
  • Department of Electronic Engineering, National Chin-Yi University of Technology; Department of Information Management, National Chin-Yi University of Technology, Taiping City, Taiwan 411; Department of Automatic Control Engineering, Feng Chia University, Taichung, Taiwan 40724

  • Venue:
  • UAHCI '09 Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction. Part III: Applications and Services
  • Year:
  • 2009


Abstract

In human communication, the speaker's facial expression and lip-shape movement carry rich linguistic information. The hearing impaired, aside from using residual hearing to communicate with other people, can also use lip reading as a communication tool. With a computer-assisted lip reading system, the hearing impaired can learn lip reading freely, without the constraints of time, place, or situation. We therefore propose a computer-assisted lip reading system (CALRS) that recognizes the correct lip shape for phonetic pronunciation using image processing, an object-oriented language, and a neural network. The system accurately compares lip images of Mandarin phonetic pronunciation using a Self-Organizing Map Neural Network (SOMNN) and extension theory, helping the hearing impaired correct their pronunciation.
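The lip-image comparison described above rests on a Self-Organizing Map, which maps feature vectors onto a grid of prototype nodes so that similar inputs land on nearby nodes. The sketch below is a minimal, generic SOM in Python with NumPy; the grid size, learning schedule, and feature vectors (synthetic here) are illustrative assumptions, not the actual CALRS lip-image features or SOMNN configuration, which the abstract does not specify.

```python
import numpy as np

def train_som(data, grid_h=4, grid_w=4, epochs=100, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small Self-Organizing Map on feature vectors.

    data: array of shape (n_samples, dim). Returns the trained
    weight grid of shape (grid_h, grid_w, dim).
    """
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates, used to compute neighborhood distances on the map
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]
    coords = np.stack([ys, xs], axis=-1).astype(float)
    for t in range(epochs):
        # Linearly decay learning rate and neighborhood radius
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 1e-3
        for x in data:
            # Best-matching unit: node whose weight vector is closest to x
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighborhood around the BMU on the grid
            grid_d = np.linalg.norm(coords - np.array(bmu, dtype=float), axis=-1)
            h = np.exp(-(grid_d ** 2) / (2 * sigma ** 2))
            # Pull neighboring nodes toward the input
            weights += lr * h[..., None] * (x - weights)
    return weights

def classify(weights, x):
    """Return grid coordinates of the best-matching unit for input x."""
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)
```

In a recognition setting like the one the abstract describes, each grid node would be labeled with the phonetic lip shape whose training samples it wins most often, and a learner's lip image would be judged by which labeled node its feature vector maps to.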