Gait is an important biometric modality for recognizing humans. Unlike most other biometrics, gait can be captured at a distance, which makes it an unobtrusive cue for recognition. In this paper, an unrestricted gait recognition algorithm is proposed that uses 3D skeleton information and the trajectory covariance of joint points. The 3D skeleton is generated from depth images captured with a Kinect sensor, and the skeleton points are tracked over time for gait analysis. The covariance measures between these skeleton-point trajectories are computed, and the resulting covariance matrices form the gait model. Gait is then recognized by finding the minimum dissimilarity between the gait models of the training and test data. Recognition accuracy above 90% is achieved on a data set of 20 subjects covering both fixed- and moving-camera scenarios.
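The pipeline described above (trajectory covariance as the gait model, nearest-neighbor matching by minimum dissimilarity) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, and the generalized-eigenvalue distance used here is the metric commonly paired with covariance descriptors; the paper does not specify its exact dissimilarity measure.

```python
import numpy as np

def gait_covariance(trajectories):
    """Gait model: covariance matrix over skeleton joint trajectories.

    trajectories: array of shape (T, D) -- T frames, D flattened joint
    coordinates (e.g. 20 Kinect joints x 3 = 60 dimensions).
    """
    return np.cov(trajectories, rowvar=False)

def dissimilarity(c1, c2, eps=1e-6):
    """Distance between two covariance matrices.

    Uses the generalized-eigenvalue (Riemannian) metric often applied to
    covariance descriptors; a small ridge keeps the matrices invertible.
    """
    d = c1.shape[0]
    c1 = c1 + eps * np.eye(d)
    c2 = c2 + eps * np.eye(d)
    # Generalized eigenvalues of (c1, c2); distance is the norm of their logs.
    eigvals = np.linalg.eigvals(np.linalg.solve(c1, c2)).real
    return float(np.sqrt(np.sum(np.log(eigvals) ** 2)))

def recognize(test_model, gallery):
    """Return the subject id whose gallery gait model has minimum
    dissimilarity to the test model (nearest-neighbor matching)."""
    return min(gallery, key=lambda sid: dissimilarity(test_model, gallery[sid]))
```

In use, each training sequence contributes one covariance matrix to the gallery, and a test sequence is assigned the identity of the closest gallery model.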