Extracting the movement of lip and tongue during articulation

  • Authors:
  • Hanhoon Park; Seung-Wook Hong; Jong-Il Park; Sung-Kyun Moon; Hyeongseok Ko

  • Affiliations:
  • Division of Electrical and Computer Engineering, Hanyang University, Seoul, Korea (Hanhoon Park, Seung-Wook Hong, Jong-Il Park)
  • Department of Otolaryngology, Ajou University, Suwon, Korea (Sung-Kyun Moon)
  • School of Electrical Engineering, Seoul National University, Seoul, Korea (Hyeongseok Ko)

  • Venue:
  • PCM '05: Proceedings of the 6th Pacific-Rim Conference on Advances in Multimedia Information Processing - Volume Part I
  • Year:
  • 2005


Abstract

A method that extracts the 3-D shape and movement of the lips and tongue and displays them simultaneously is presented. Lip movement is easily observable and can therefore be extracted with a camera. However, the real movement of the tongue is difficult to extract exactly because the tongue may be occluded by the lips and teeth. In this paper, we use a magnetic resonance imaging (MRI) device to capture the sagittal view of tongue movement during articulation. Since the frame rate of the available MRI device is very low (5 fps), we obtain a smooth video sequence (20 fps) with a new contour-based interpolation method. The overall procedure for extracting lip and tongue movement is as follows. First, fiducial color markers attached to the lips are detected, and the 3-D movement of the lips is computed with a 3-D reconstruction technique. Next, to extract the movement of the tongue, we apply a series of simple image processing algorithms to the MRI images and then extract the tongue contour interactively. Finally, the lip and tongue data are synchronized and temporally interpolated. An OpenGL-based program was implemented to visualize the data interactively. We performed experiments using the Korean basic syllables, and some of the resulting data are presented. The experimental results are consistent with theoretical and empirical observations in linguistics. The acquired data can serve not only as a fundamental database for scientific purposes but also as educational material for language rehabilitation of the hearing-impaired. It can also be used to produce high-quality lip-synchronized animation that includes tongue movement.
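The abstract mentions that 3-D lip movement is recovered from detected markers with a 3-D reconstruction technique, but does not specify which one. A standard choice for recovering a point seen by two calibrated cameras is linear (DLT) triangulation; the sketch below is an illustrative assumption, not the paper's documented method, and the function name and projection matrices are hypothetical.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Recover a 3-D marker position from its pixel coordinates (x1, x2)
    in two calibrated views with 3x4 projection matrices P1, P2,
    using the linear DLT method (least-squares via SVD)."""
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

Applied to each detected color marker per stereo frame pair, this yields the 3-D lip-marker trajectories over time.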
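The paper upsamples the 5 fps MRI sequence to 20 fps with a contour-based interpolation method whose details are not given in the abstract. As a minimal sketch of the idea, assuming the extracted tongue contours are resampled so that corresponding points align across frames, intermediate contours can be generated by pointwise linear blending; all function names here are illustrative.

```python
import numpy as np

def interpolate_contours(c0, c1, num_intermediate):
    """Generate intermediate contours between two tongue contours c0, c1
    (N x 2 arrays of corresponding points) by pointwise linear blending."""
    frames = []
    for k in range(1, num_intermediate + 1):
        t = k / (num_intermediate + 1)
        frames.append((1.0 - t) * c0 + t * c1)
    return frames

def upsample_sequence(contours, factor):
    """Upsample a contour sequence by an integer factor,
    e.g. factor=4 to turn 5 fps key contours into a 20 fps sequence."""
    out = []
    for c0, c1 in zip(contours, contours[1:]):
        out.append(c0)
        out.extend(interpolate_contours(c0, c1, factor - 1))
    out.append(contours[-1])
    return out
```

The same blending, applied to the reconstructed 3-D lip-marker positions, would also serve the final synchronization and temporal-interpolation step.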