Introduction to artificial neural systems
The roots of backpropagation: from ordered derivatives to neural networks and political forecasting
Neural Networks: A Comprehensive Foundation
Blobworld: Image Segmentation Using Expectation-Maximization and Its Application to Image Querying
IEEE Transactions on Pattern Analysis and Machine Intelligence
Statistical color models with application to skin detection
International Journal of Computer Vision
Analyzing Articulated Motion Using Expectation-Maximization
CVPR '97 Proceedings of the 1997 Conference on Computer Vision and Pattern Recognition (CVPR '97)
Silhouette-Based Human Identification from Body Shape and Gait
FGR '02 Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition
Estimation of Articulated Motion Using Kinematically Constrained Mixture Densities
NAM '97 Proceedings of the 1997 IEEE Workshop on Motion of Non-Rigid and Articulated Objects (NAM '97)
Hebbian learning with winner take all for spiking neural networks
IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
Weight-carrying conditions in human subjects are classified using a mixture-of-Gaussians (MOG) based classifier and a neural network. The expectation-maximization (EM) algorithm is used to learn the model parameters of the MOG classifier. Because the variables change over time as the subjects walk, while the likelihood score in a mixture model is usually computed for stationary data, a scoring system is developed to measure how likely it is that a time sequence of variables belongs to a particular distribution. A neural network (NN) based classifier is also developed, trained with the traditional back-propagation algorithm. In a binary classification task of detecting "load" versus "no-load" conditions from just two variables, the test-set accuracy is 74.1% using MOG and 66.4% using NN. The lower NN accuracy is not surprising, since the NN receives time-averaged variables as inputs whereas MOG can exploit the dynamic information. In another task with four classes and two variables, test-set accuracy was 37.2% using MOG and 34.2% using NN, both better than chance. With 7 variables, NN accuracy was 81.4% for binary classification and 41.3% for four-class classification. An interesting finding is that the NN (using 7 averaged variables) outperformed human perceivers who were asked to judge from stick-figure animations of the subjects walking. Another interesting revelation was that the covariance matrices of the MOG model revealed "anti-phase" locking of the elbow and stoop angles as the subjects walk.
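The per-class mixture-model scoring idea can be sketched as follows. This is a minimal illustration with synthetic, hypothetical data (the means, variances, variable names, and component counts are assumptions, not the study's actual features or parameters): one Gaussian mixture is fit per class via EM, and a time sequence is scored by its average per-frame log-likelihood under each class model.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical two-variable gait data: each trial is a (T, 2) time
# sequence of joint-angle measurements sampled as a subject walks.
rng = np.random.default_rng(0)
load_trials = [rng.normal([30.0, 15.0], 2.0, size=(50, 2)) for _ in range(20)]
noload_trials = [rng.normal([25.0, 10.0], 2.0, size=(50, 2)) for _ in range(20)]

# Fit one mixture-of-Gaussians model per class with EM on the pooled frames.
gmm_load = GaussianMixture(n_components=2, random_state=0).fit(np.vstack(load_trials))
gmm_noload = GaussianMixture(n_components=2, random_state=0).fit(np.vstack(noload_trials))

def classify_sequence(seq):
    """Score a time sequence by its average per-frame log-likelihood
    under each class model; the higher-scoring model wins."""
    return "load" if gmm_load.score(seq) > gmm_noload.score(seq) else "no-load"

# An unseen walk drawn from the "load" distribution.
test_walk = rng.normal([30.0, 15.0], 2.0, size=(50, 2))
print(classify_sequence(test_walk))
```

Averaging the per-frame log-likelihood (which `GaussianMixture.score` does) lets sequences of different lengths be compared on a common scale, which is one simple way to adapt a stationary mixture-model score to time-varying gait data.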