Hand gesture-based visual user interface for infotainment

  • Authors:
  • Eshed Ohn-Bar;Cuong Tran;Mohan Trivedi

  • Affiliations:
  • University of California, San Diego, CA (all authors)

  • Venue:
  • Proceedings of the 4th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
  • Year:
  • 2012


Abstract

We present a real-time vision-based system that discriminates hand gestures performed by in-vehicle front-row-seat occupants for accessing the infotainment system. A hand gesture-based visual user interface may be more natural and intuitive to the user than current tactile interaction interfaces. Consequently, it may encourage gaze-free interaction, which can alleviate driver distraction without limiting the user's infotainment experience. The system uses visible and depth images of the dashboard and center-console area of the vehicle. The first step of the algorithm uses a representation of the image area given by a modified histogram-of-oriented-gradients (HOG) descriptor and a support vector machine (SVM) to classify whether the driver, the passenger, or no one is interacting with the region of interest. The second step extracts gesture characteristics from the temporal dynamics of the features derived in the first step, which are then fed to an SVM that classifies the gesture into one of six hand gesture classes. The rate of correct user classification into one of the three classes is 97.9% on average. Average hand gesture classification rates for the driver and passenger using color and depth input are above 94%. These rates were achieved on data collected in-vehicle across varying illumination conditions and human subjects, demonstrating the feasibility of a hand gesture-based in-vehicle visual user interface.
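The first-stage descriptor can be sketched in plain NumPy. This is a minimal, standard HOG computation, not the authors' modified variant (whose details are not given in the abstract); the cell size and orientation-bin count below are illustrative assumptions, and the resulting vector is what would be passed to the first-stage SVM for driver/passenger/no-one classification.

```python
import numpy as np

def hog_descriptor(img, cell=8, nbins=9):
    """Minimal HOG sketch: per-cell histograms of unsigned gradient
    orientation, weighted by gradient magnitude, L2-normalized.
    `cell` and `nbins` are illustrative choices, not the paper's."""
    # Image gradients via central differences (borders left at zero).
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]

    mag = np.hypot(gx, gy)                       # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180   # unsigned orientation in [0, 180)

    h, w = img.shape
    cells_y, cells_x = h // cell, w // cell
    hist = np.zeros((cells_y, cells_x, nbins))
    bin_w = 180.0 / nbins

    # Accumulate magnitude-weighted orientation histograms per cell.
    for cy in range(cells_y):
        for cx in range(cells_x):
            m = mag[cy * cell:(cy + 1) * cell, cx * cell:(cx + 1) * cell]
            a = ang[cy * cell:(cy + 1) * cell, cx * cell:(cx + 1) * cell]
            idx = np.minimum((a // bin_w).astype(int), nbins - 1)
            for b in range(nbins):
                hist[cy, cx, b] = m[idx == b].sum()

    # Flatten and L2-normalize into one feature vector.
    vec = hist.ravel()
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec
```

For a 64x64 region with these settings, the descriptor has 8 x 8 cells x 9 bins = 576 dimensions. The second stage described in the abstract would then collect such per-frame features over a time window and feed their temporal dynamics to a second SVM for six-class gesture recognition.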