Analysis of Rotational Robustness of Hand Detection with a Viola-Jones Detector
Proceedings of the 17th International Conference on Pattern Recognition (ICPR '04), Volume 3
Histograms of Oriented Gradients for Human Detection
Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), Volume 1
Assessing subjective response to haptic feedback in automotive touchscreens
Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications
A novel active heads-up display for driver assistance
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics - Special issue on human computing
Vision-based infotainment user determination by hand recognition for driver assistance
IEEE Transactions on Intelligent Transportation Systems
LIBSVM: A library for support vector machines
ACM Transactions on Intelligent Systems and Technology (TIST)
Visual cues supporting direct touch gesture interaction with in-vehicle information systems
Proceedings of the 2nd International Conference on Automotive User Interfaces and Interactive Vehicular Applications
Looking-In and Looking-Out of a Vehicle: Computer-Vision-Based Enhanced Vehicle Safety
IEEE Transactions on Intelligent Transportation Systems
Natural, intuitive finger based input as substitution for traditional vehicle control
Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications
Standardization of the in-car gesture interaction space
Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
We present a real-time vision-based system that discriminates hand gestures performed by front-row seat occupants to access the in-vehicle infotainment system. A hand-gesture-based visual user interface may be more natural and intuitive than current tactile interfaces, and may therefore encourage gaze-free interaction, alleviating driver distraction without limiting the user's infotainment experience. The system uses visible and depth images of the dashboard and center-console area of the vehicle. In the first step, the algorithm represents the image region with a modified histogram-of-oriented-gradients (HOG) descriptor and uses a support vector machine (SVM) to classify whether the driver, the passenger, or no one is interacting with the region of interest. The second step extracts gesture characteristics from the temporal dynamics of the features derived in the first step and feeds them to a second SVM, which classifies the gesture into one of six hand-gesture classes. The rate of correct user classification into one of the three classes is 97.9% on average, and average hand-gesture classification rates for the driver and passenger using color and depth input exceed 94%. These rates were achieved on in-vehicle data collected over varying illumination conditions and human subjects, demonstrating the feasibility of a hand-gesture-based in-vehicle visual user interface.
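The two-stage pipeline described in the abstract can be sketched as follows. This is a hypothetical minimal illustration, not the authors' implementation: the descriptor below is the standard HOG formulation (the paper's "modified" variant is not specified here), the temporal features are a simple mean/std summary chosen for illustration, and all data, window sizes, and class labels are invented stand-ins.

```python
import numpy as np
from sklearn.svm import SVC

def hog_descriptor(img, cell=8, bins=9):
    """Standard HOG-style descriptor: unsigned-gradient orientation
    histograms over non-overlapping cells, L2-normalized globally."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)  # unsigned orientation in [0, pi)
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            a = ang[i:i + cell, j:j + cell].ravel()
            m = mag[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, np.pi), weights=m)
            feats.append(hist)
    v = np.concatenate(feats)
    return v / (np.linalg.norm(v) + 1e-8)

def temporal_features(frame_descriptors):
    """Summarize per-frame descriptors over a gesture window
    (here: per-dimension mean and standard deviation)."""
    d = np.asarray(frame_descriptors)
    return np.concatenate([d.mean(axis=0), d.std(axis=0)])

rng = np.random.default_rng(0)

# Stage 1: user classification (0 = no one, 1 = driver, 2 = passenger)
# trained on synthetic 64x64 "frames" standing in for the ROI images.
frames = rng.random((30, 64, 64))
user_labels = rng.integers(0, 3, 30)
X_user = np.stack([hog_descriptor(f) for f in frames])
user_clf = SVC(kernel="rbf").fit(X_user, user_labels)

# Stage 2: gesture classification (six classes) from temporal dynamics
# of the per-frame features over synthetic 10-frame gesture windows.
windows = rng.random((12, 10, 64, 64))
gesture_labels = rng.integers(0, 6, 12)
X_gest = np.stack([temporal_features([hog_descriptor(f) for f in w])
                   for w in windows])
gesture_clf = SVC(kernel="rbf").fit(X_gest, gesture_labels)

print(user_clf.predict(X_user[:1]), gesture_clf.predict(X_gest[:1]))
```

With 8x8 cells and 9 orientation bins, a 64x64 region yields a 576-dimensional per-frame descriptor; the abstract's two-SVM design keeps the who-is-interacting decision separate from the what-gesture decision, so each classifier sees only the features relevant to its task.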