Controlling computer by lip gestures employing neural networks

  • Authors:
  • Piotr Dalka; Andrzej Czyżewski

  • Affiliations:
  • Gdansk University of Technology, Multimedia Systems Department, Gdansk, Poland (both authors)

  • Venue:
  • RSCTC'10 Proceedings of the 7th international conference on Rough sets and current trends in computing
  • Year:
  • 2010


Abstract

Results of experiments on lip gesture recognition with an artificial neural network (ANN) are discussed. The neural network module forms the core element of a multimodal human-computer interface called LipMouse. This solution allows a user to operate a computer using lip movements and gestures. The user's face is detected in a video stream from a standard web camera using a cascade of boosted classifiers working with Haar-like features. Lip region extraction is based on a lip shape approximation calculated by means of lip image segmentation using fuzzy clustering. The ANN is fed with a feature vector describing the appearance of the lip region. The descriptors used include a luminance histogram, statistical moments, and statistical parameters of co-occurrence matrices. The ANN recognizes three lip gestures with good accuracy: mouth opening, sticking out the tongue, and forming puckered lips.
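The abstract names three families of appearance descriptors fed to the ANN: a luminance histogram, statistical moments, and co-occurrence matrix statistics. The paper's exact parameterization (bin counts, grey levels, offsets, which Haralick statistics) is not given here, so the following is a minimal NumPy sketch of such a feature vector under assumed values (16 histogram bins, 8 grey levels, a horizontal neighbour offset, and energy/contrast/homogeneity as the co-occurrence statistics); `lip_region_features` is a hypothetical helper name, not the authors' code.

```python
import numpy as np

def lip_region_features(gray, n_bins=16, levels=8):
    """Appearance features for a grayscale lip-region image:
    luminance histogram, statistical moments, and statistics of a
    simple horizontal grey-level co-occurrence matrix (GLCM).
    Parameter choices here are illustrative assumptions."""
    g = gray.astype(np.float64)

    # Normalized luminance histogram over assumed n_bins bins
    hist, _ = np.histogram(g, bins=n_bins, range=(0, 256))
    hist = hist / hist.sum()

    # Statistical moments of the luminance distribution
    mean = g.mean()
    std = g.std()
    centered = g - mean
    skew = (centered ** 3).mean() / (std ** 3 + 1e-12)
    kurt = (centered ** 4).mean() / (std ** 4 + 1e-12)

    # GLCM for horizontally adjacent pixel pairs, quantized to `levels`
    q = (g * levels / 256).astype(int).clip(0, levels - 1)
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    glcm /= glcm.sum()

    # Haralick-style GLCM statistics: energy, contrast, homogeneity
    i, j = np.indices(glcm.shape)
    energy = (glcm ** 2).sum()
    contrast = (((i - j) ** 2) * glcm).sum()
    homogeneity = (glcm / (1.0 + np.abs(i - j))).sum()

    return np.concatenate(
        [hist, [mean, std, skew, kurt, energy, contrast, homogeneity]]
    )
```

The resulting vector (16 histogram bins plus 7 scalar statistics here) would be the input to a feed-forward classifier distinguishing the three gestures; the actual descriptor set and dimensionality in the paper may differ.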