Computer interface using eye tracking for handicapped people

  • Authors:
  • Eun Yi Kim; Se Hyun Park

  • Affiliations:
  • Department of Internet and Multimedia Engineering, Konkuk Univ., Korea; School of Computer and Communication, Daegu Univ., Korea

  • Venue:
  • IDEAL'06 Proceedings of the 7th international conference on Intelligent Data Engineering and Automated Learning
  • Year:
  • 2006


Abstract

In this paper, a computer interface for handicapped people is proposed, where input signals are given by the user's eye movements. Eye movement is detected by a neural network (NN)-based texture classifier, which frees the system from requiring a constrained environment. To be robust to the natural motion of a user, we first detect the user's face using skin-color information, and then detect his or her eyes using the NN-based texture classifier. Once the eyes are detected, tracking is performed using the mean-shift algorithm. We use this eye-tracking system as an interface to control surrounding devices such as audio, TV, lights, phone, and so on. The experimental results verify the feasibility and validity of the proposed eye-tracking system as an interface for handicapped people.
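The sketch below illustrates the general flow described in the abstract, assuming an OpenCV-style pipeline: a skin-color segmentation step followed by mean-shift tracking of a candidate region. It is not the authors' implementation; in particular, the NN-based texture classifier for eye detection is not reproduced, and the skin-color likelihood map, threshold values, and initial window are placeholders chosen for illustration.

```python
import cv2

# Sketch of the two-stage idea: skin-color likelihood map + mean-shift tracking.
# The paper's NN-based texture classifier is omitted; the skin mask stands in
# as the likelihood image that mean shift tracks. All numeric values are guesses.

def skin_mask(frame_bgr):
    """Rough skin-color segmentation in YCrCb space (thresholds are assumptions)."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    return cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
track_window = (200, 150, 80, 40)   # (x, y, w, h): hypothetical initial eye-region window
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while ok:
    prob = skin_mask(frame)                              # likelihood image for mean shift
    _, track_window = cv2.meanShift(prob, track_window, criteria)
    x, y, w, h = track_window
    cv2.rectangle(frame, (x, y), (x + w, y + h), 255, 2)  # draw tracked region
    cv2.imshow("tracking", frame)
    if cv2.waitKey(30) & 0xFF == 27:                      # Esc quits
        break
    ok, frame = cap.read()

cap.release()
cv2.destroyAllWindows()
```

In the paper's setting, the tracked eye position would then be mapped to commands for surrounding devices (audio, TV, lights, phone); that mapping layer is application-specific and is not shown here.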