Invariant Range Image Multi-Pose Face Recognition Using Gradient Face, Membership Matching Score and 3-Layer Matching Search

  • Authors:
  • Seri Pansang; Boonwat Attachoo; Chom Kimpan; Makoto Sato

  • Affiliations:
  • Seri Pansang and Boonwat Attachoo are with the Department of Computer Engineering, Faculty of Engineering, King Mongkut's Institute of Technology Ladkrabang (KMITL), Thailand. E-mail: seri@academic.cmru.ac.th, ...
  • Chom Kimpan is with the Department of Informatics, Sripatum University, Thailand. E-mail: chomk@spu.ac.th
  • Makoto Sato is with the Department of Physical Electronics, Faculty of Engineering, P & I Laboratory, Tokyo Institute of Technology, Yokohama-shi, 226-8503 Japan.

  • Venue:
  • IEICE - Transactions on Information and Systems
  • Year:
  • 2005


Abstract

The purpose of this paper is to present a novel technique for solving recognition errors in invariant range image multi-pose face recognition. The scale, center, and pose error problems were solved using a geometric transform [13]. Range image face data (RIFD) was obtained from a laser range finder and used in the model to generate multiple poses. Each pose's data size was reduced by linear reduction. The reduced RIFD was transformed into the gradient face model for facial feature extraction and for matching using the Membership Matching Score model. With this method, the experimental results are acceptable even though the gradient face image data is quite small (659 elements). Three-Layer Matching Search is an algorithm designed to reduce the search time required to reach the most accurate and similar pose position. The proposed algorithm was tested on facial range images from 130 people with normal facial expressions and without eyeglasses. The results achieved mean success rates of 95.67 percent at ±12 degrees up/down and left/right (UDLR) and 88.35 percent at ±24 degrees UDLR.
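The abstract does not give the exact formulation of the gradient face model, the Membership Matching Score, or the Three-Layer Matching Search, so the following is only a minimal Python/NumPy sketch of one plausible reading of the pipeline: a range image is turned into a gradient-based feature vector, compared to stored pose models with a tolerance-based membership score, and the best pose is selected. The function names (gradient_face, membership_score, best_pose) and the tolerance parameter are hypothetical and not from the paper.

```python
import numpy as np

def gradient_face(range_image: np.ndarray) -> np.ndarray:
    # Hypothetical gradient-face features: depth gradients of the range image,
    # flattened into one feature vector (the paper reduces this to ~659 elements).
    gy, gx = np.gradient(range_image.astype(float))
    return np.concatenate([gx.ravel(), gy.ravel()])

def membership_score(probe: np.ndarray, model: np.ndarray, tol: float = 0.1) -> float:
    # Hypothetical membership-style score: fraction of feature elements whose
    # absolute difference lies within a tolerance band (higher = more similar).
    return float(np.mean(np.abs(probe - model) <= tol))

def best_pose(probe_features: np.ndarray, pose_models: dict) -> str:
    # Exhaustive search over stored pose models; the paper's Three-Layer
    # Matching Search is described as pruning this to fewer comparisons.
    return max(pose_models, key=lambda pose: membership_score(probe_features, pose_models[pose]))
```

In this reading, a coarse-to-fine (three-layer) search would replace the exhaustive loop in best_pose by first matching against widely spaced poses and then refining around the best coarse match, which is how the reported reduction in search time would be obtained.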