Welfare interface using multiple facial features tracking

  • Authors:
  • Yunhee Shin; Eun Yi Kim

  • Affiliations:
  • Department of Internet and Multimedia Engineering, Konkuk Univ., Korea

  • Venue:
  • AI'06: Proceedings of the 19th Australian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence
  • Year:
  • 2006

Abstract

We propose a welfare interface using multiple facial feature tracking, which can efficiently implement various mouse operations. The proposed system consists of five modules: face detection, eye detection, mouth detection, facial feature tracking, and mouse control. The facial region is first obtained using a skin-color model and connected-component (CC) analysis. Thereafter, the eye regions are localized using a neural network (NN)-based texture classifier that discriminates the facial region into eye and non-eye classes, and the mouth region is localized using an edge detector. Once the eye and mouth regions are localized, they are continuously and accurately tracked by the mean-shift algorithm and template matching, respectively. Based on the tracking results, mouse operations such as movement and clicking are implemented. To assess the validity of the proposed system, it was applied to an interface system for a web browser and tested on a group of 25 users. The results show that our system has an accuracy of 99% and processes more than 12 frames/sec on a PC for 320×240 input images, so it can provide user-friendly and convenient access to a computer in real time.
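
To illustrate the detection stage, the sketch below approximates the skin-color segmentation followed by connected-component analysis described above, using OpenCV in Python. The YCrCb skin bounds and the largest-component heuristic are illustrative assumptions; the paper's actual skin-color model, the NN-based texture classifier for the eyes, and the edge-based mouth detector are not reproduced here.

```python
import cv2
import numpy as np

def detect_face_region(frame_bgr):
    """Return the bounding box (x, y, w, h) of the largest skin-colored blob.

    Minimal sketch of the face-detection stage: skin-color thresholding
    followed by connected-component analysis. The YCrCb bounds below are a
    common heuristic, not the skin-color model used in the paper.
    """
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    skin_mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

    # Connected-component analysis: take the largest blob as the face.
    n_labels, _, stats, _ = cv2.connectedComponentsWithStats(skin_mask)
    if n_labels <= 1:  # label 0 is the background; no skin blob found
        return None
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    x, y, w, h = stats[largest, :4]
    return int(x), int(y), int(w), int(h)
```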
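
The tracking stage can be sketched in the same spirit: mean shift over a back-projected histogram for the eye windows, and template matching for the mouth. The hue-histogram back-projection and the normalized-correlation matching score below are assumptions chosen for illustration; the abstract does not specify these details.

```python
import cv2

def track_eye_meanshift(frame_bgr, eye_window, eye_hist):
    """Shift the eye search window with the mean-shift algorithm.

    `eye_hist` is assumed to be a normalized hue histogram of the eye
    region taken from the frame in which it was first localized.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], eye_hist, [0, 180], 1)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    _, eye_window = cv2.meanShift(back_proj, eye_window, criteria)
    return eye_window  # updated (x, y, w, h)

def track_mouth_template(frame_gray, mouth_template):
    """Relocate the mouth by template matching (normalized correlation)."""
    result = cv2.matchTemplate(frame_gray, mouth_template,
                               cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)
    h, w = mouth_template.shape[:2]
    return max_loc[0], max_loc[1], w, h
```

A driving loop would run the detectors once to initialize the eye and mouth windows, then call the trackers on each new frame and map window displacements to cursor movement and clicks, in line with the mouse-control module described above.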