A Human–Computer Interface Using Symmetry Between Eyes to Detect Gaze Direction

  • Authors:
  • J. J. Magee; M. Betke; J. Gips; M. R. Scott; B. N. Waber

  • Affiliations:
  • Dept. of Comput. Sci., Boston Univ., Boston, MA

  • Venue:
  • IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
  • Year:
  • 2008

Abstract

In cases of paralysis so severe that a person's ability to control movement is limited to the muscles around the eyes, eye movements or blinks are the only way for the person to communicate. Interfaces that assist in such communication are often intrusive, require special hardware, or rely on active infrared illumination. A nonintrusive communication interface system called EyeKeys was therefore developed, which runs on a consumer-grade computer with video input from an inexpensive Universal Serial Bus camera and works without special lighting. The system detects and tracks the person's face using multiscale template correlation. The symmetry between the left and right eyes is exploited to detect whether the person is looking at the camera or to the left or right side. The detected eye direction can then be used to control applications such as spelling programs or games. The game "BlockEscape" was developed to evaluate the performance of EyeKeys and compare it to a mouse-substitution interface. Experiments with EyeKeys have shown that it is an easily used computer input and control device for able-bodied people and has the potential to become a practical tool for people with severe paralysis.
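The symmetry test at the heart of the abstract can be illustrated with a short sketch: mirror one eye patch horizontally and compare it with the other. When the person looks at the camera, the mirrored right eye resembles the left eye and the difference image is small and balanced; when the gaze shifts sideways, both pupils move the same way in image coordinates, so after mirroring the difference mass piles up on one side. The code below is a minimal illustration of that idea, not the paper's actual implementation; the function name, the half-image mass comparison, the threshold, and the left/right label convention are all assumptions for the sketch.

```python
import numpy as np


def classify_gaze(left_eye: np.ndarray, right_eye: np.ndarray,
                  threshold: float = 0.1) -> str:
    """Classify gaze as 'center', 'left', or 'right' from two equally
    sized grayscale eye patches with values in [0, 1].

    Simplified symmetry idea (hypothetical, inspired by the abstract):
    mirror the right-eye patch horizontally and subtract it from the
    left-eye patch.  A gaze at the camera yields a near-zero, balanced
    difference; a sideways gaze makes the dark pupil pixels pile up on
    one half of the difference image.
    """
    mirrored = right_eye[:, ::-1]        # flip the right eye horizontally
    diff = left_eye - mirrored           # signed difference image
    # Compare the signed mass in the left and right halves of the patch.
    half = diff.shape[1] // 2
    left_mass = diff[:, :half].sum()
    right_mass = diff[:, half:].sum()
    imbalance = (left_mass - right_mass) / diff.size
    if abs(imbalance) < threshold:
        return "center"
    # Label convention here is illustrative: negative imbalance means the
    # pupils sit in the left half of the image coordinates.
    return "left" if imbalance < 0 else "right"


# Usage with synthetic 8x8 patches: white background, dark 2x2 "pupil".
def make_eye(col: int) -> np.ndarray:
    eye = np.ones((8, 8))
    eye[3:5, col:col + 2] = 0.0          # place the pupil at this column
    return eye


print(classify_gaze(make_eye(3), make_eye(3)))  # pupils centered
print(classify_gaze(make_eye(1), make_eye(1)))  # pupils shifted left
```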