Visual Human-Machine Interaction

  • Authors:
  • Alexander Zelinsky

  • Venue:
  • AI '99 Proceedings of the 12th Australian Joint Conference on Artificial Intelligence: Advanced Topics in Artificial Intelligence
  • Year:
  • 1999

Abstract

It is envisaged that computers of the future will have smart interfaces, such as speech and vision, which will facilitate natural and easy human-machine interaction. Gestures of the face and hands could become a natural way to control the operations of a computer or a machine, such as a robot. In this paper, we present a vision-based interface that tracks a person's facial features and eye-gaze point in real time. The system robustly tracks facial features, detects tracking failures, and has an automatic mechanism for error recovery. It is insensitive to lighting changes and to occlusions or distortions of the facial features. The system is user independent and can automatically calibrate for each different user. An application of this technology for driver fatigue detection and for evaluating the ergonomic design of motor vehicles has been developed. Our human-machine interface has enormous potential in other applications that control machines and processes and that measure human performance; for example, product possibilities exist for assisting the disabled and in video game entertainment.
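The failure detection and recovery behaviour described above can be illustrated with a common pattern: track a feature by template matching within a search window, and treat a low match score as a tracking failure that triggers re-detection. The sketch below is a minimal, assumption-laden illustration (normalized cross-correlation, a hypothetical `thresh=0.8` confidence threshold, toy list-of-lists images), not the paper's actual algorithm.

```python
import math

def ncc(a, b):
    """Normalized cross-correlation between two equal-size patches (flat lists)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def patch(img, r, c, h, w):
    """Extract an h-by-w patch at (r, c) from a list-of-lists image, flattened."""
    return [img[i][j] for i in range(r, r + h) for j in range(c, c + w)]

def track(img, template, h, w, last, search=2, thresh=0.8):
    """Search a window around the last known position for the best template
    match. The boolean result flags tracking failure when the best score
    falls below `thresh`, which would trigger automatic re-detection
    (the error-recovery step the abstract mentions)."""
    rows, cols = len(img), len(img[0])
    best_score, best_pos = -1.0, last
    r0, c0 = last
    for r in range(max(0, r0 - search), min(rows - h, r0 + search) + 1):
        for c in range(max(0, c0 - search), min(cols - w, c0 + search) + 1):
            s = ncc(template, patch(img, r, c, h, w))
            if s > best_score:
                best_score, best_pos = s, (r, c)
    return best_pos, best_score, best_score >= thresh
```

Because the correlation is normalized by patch mean and variance, the score is unaffected by uniform brightness or contrast changes, which loosely mirrors the lighting insensitivity claimed in the abstract.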