Interpreting dynamic meanings by integrating gesture and posture recognition system

  • Authors:
  • Omer Rashid Ahmed, Ayoub Al-Hamadi, Bernd Michaelis

  • Affiliation:
  • Institute for Electronics, Signal Processing and Communications, Otto-von-Guericke-University Magdeburg, Germany (all authors)

  • Venue:
  • ACCV'10: Proceedings of the 2010 International Conference on Computer Vision - Volume Part I
  • Year:
  • 2010

Abstract

Integrating information from different systems supports enhanced functionality; however, it requires rigorous, pre-determined criteria for the fusion. This paper proposes a novel approach that uses a particle filter to determine the integration criteria for fusing hand gesture and posture recognition systems at the decision level. Decision-level fusion first requires classification of the hand gesture and posture symbols: a hidden Markov model (HMM) classifies alphabets and numbers in the hand gesture recognition system, whereas a support vector machine (SVM) classifies ASL fingerspelling signs (alphabets and numbers) in the posture recognition system. These classification results are input to the integration framework, which computes contribution-weights for the two systems. For this purpose, the Condensation algorithm approximates the optimal a-posteriori probability from the a-priori probability and a Gaussian likelihood function, making the weights independent of classification ambiguities. Casting recognition as a problem of regular grammar, we develop production rules based on a context-free grammar (CFG) for a restaurant scenario. Using the contribution-weights, the recognized outcomes are mapped onto the CFG rules to infer meaningful expressions. Experiments on 500 different combinations of restaurant orders achieve an overall inference accuracy of 98.3%, demonstrating the significance of the proposed approach.
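For readers who want a concrete picture of the fusion step, the following Python sketch shows how contribution-weights for the two classifiers could be approximated with a Condensation-style (sampling-importance-resampling) particle filter. Everything here is an assumption for illustration: the function name, the scalar weight state, and the likelihood parameters (`sigma`, the diffusion noise) are hypothetical and do not reproduce the paper's exact model.

```python
import numpy as np

def condensation_weights(gesture_score, posture_score,
                         n_particles=200, sigma=0.1, n_iters=10, seed=0):
    """Hypothetical Condensation-style estimate of the gesture classifier's
    contribution-weight (the posture weight is its complement): sample
    candidate weights from a uniform prior, score them with a Gaussian
    likelihood, resample, and diffuse."""
    rng = np.random.default_rng(seed)
    particles = rng.uniform(0.0, 1.0, n_particles)   # a-priori weight samples
    target = max(gesture_score, posture_score)       # best available evidence
    for _ in range(n_iters):
        # Gaussian likelihood of each candidate weight given both scores
        predicted = particles * gesture_score + (1 - particles) * posture_score
        likelihood = np.exp(-0.5 * ((predicted - target) / sigma) ** 2)
        likelihood /= likelihood.sum()
        # select (resample) in proportion to likelihood, then perturb
        idx = rng.choice(n_particles, size=n_particles, p=likelihood)
        particles = np.clip(particles[idx] + rng.normal(0.0, 0.02, n_particles),
                            0.0, 1.0)
    w_gesture = float(particles.mean())              # posterior mean as weight
    return w_gesture, 1.0 - w_gesture

# e.g. condensation_weights(0.9, 0.6) returns a weight pair favoring the
# gesture classifier, since its score better explains the strongest evidence.
```

The grammar side can be illustrated the same way. The toy context-free grammar below uses NLTK's standard CFG and chart parser with invented terminals; the paper's actual production rules for the restaurant scenario are not reproduced. It parses a fingerspelled item name followed by a quantity digit:

```python
import nltk

# Toy restaurant-order grammar: an ORDER is a fingerspelled ITEM plus a QTY.
grammar = nltk.CFG.fromstring("""
ORDER -> ITEM QTY
ITEM -> LETTER ITEM | LETTER
QTY -> DIGIT
LETTER -> 'T' | 'E' | 'A'
DIGIT -> '2'
""")

parser = nltk.ChartParser(grammar)
# Recognized symbol stream "T E A 2" -> the expression order(TEA, 2)
for tree in parser.parse(['T', 'E', 'A', '2']):
    print(tree)   # (ORDER (ITEM (LETTER T) (ITEM ...)) (QTY (DIGIT 2)))
```

In both sketches the key design point mirrors the abstract: the particle filter's posterior weights decide how much each classifier's output is trusted, and the grammar then turns the recognized symbol stream into a meaningful expression.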