As a contribution to the field of human-machine interaction, we present a system that analyzes human movements online, based on the concept of Laban Movement Analysis (LMA). The implementation uses a Bayesian model for learning and classification, and we present results for its application to gesture recognition. Today's technology offers a vast number of devices for human-machine interaction, yet implemented cognitive processes that exploit these possibilities are still rare. Future approaches must offer the user an effortless and intuitive way of interacting. We present Laban Movement Analysis as a concept for identifying features of human movement that are useful for classifying human actions. The movements are captured using both vision and a magnetic tracker. The descriptor opens possibilities towards expressiveness and emotional content. To solve the classification problem we use the Bayesian framework, as it offers an intuitive approach to learning and classification; it also provides the possibility of anticipating the performed action given the features observed so far. We present results obtained with our system through its embodiment in the social robot 'Nicole', in a scenario where a person performs gestures and 'Nicole' reacts by means of audio output and robot movement.
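To illustrate the idea of Bayesian classification with anticipation, the following is a minimal sketch (not the paper's actual implementation): a naive Bayes classifier over discrete movement features, where the posterior over gesture classes can be queried from a partial set of observed features, allowing a likely gesture to be anticipated before the movement completes. The class name, feature names such as `effort_time`, and the binary-feature smoothing are illustrative assumptions.

```python
import math
from collections import defaultdict


class NaiveBayesGestureClassifier:
    """Sketch of Bayesian gesture classification: learn per-class
    feature frequencies, then compute a posterior over gestures from
    whichever features have been observed so far (anticipation)."""

    def __init__(self):
        self.class_counts = defaultdict(int)
        # (feature name, feature value) counts per gesture class
        self.feature_counts = defaultdict(lambda: defaultdict(int))

    def train(self, gesture, features):
        """Record one observed example of a gesture with its features."""
        self.class_counts[gesture] += 1
        for name, value in features.items():
            self.feature_counts[gesture][(name, value)] += 1

    def posterior(self, observed):
        """P(gesture | observed features); `observed` may be partial."""
        total = sum(self.class_counts.values())
        log_scores = {}
        for gesture, n in self.class_counts.items():
            # log prior
            score = math.log(n / total)
            # log likelihood, Laplace-smoothed (assumes binary-valued
            # features for the simple n + 2 denominator)
            for name, value in observed.items():
                count = self.feature_counts[gesture][(name, value)]
                score += math.log((count + 1) / (n + 2))
            log_scores[gesture] = score
        # normalize log scores into probabilities
        m = max(log_scores.values())
        exp_scores = {g: math.exp(s - m) for g, s in log_scores.items()}
        z = sum(exp_scores.values())
        return {g: v / z for g, v in exp_scores.items()}


# Usage: train on two hypothetical gestures, then anticipate from a
# single observed feature before the rest of the movement is seen.
clf = NaiveBayesGestureClassifier()
clf.train("wave", {"effort_time": "sudden", "space": "indirect"})
clf.train("bow", {"effort_time": "sustained", "space": "direct"})
partial_posterior = clf.posterior({"effort_time": "sudden"})
```

In this sketch the posterior is recomputed as each new feature arrives, which mirrors the anticipatory behavior described above: the most probable class under the partial posterior is the system's current best guess at the gesture being performed.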