Comparison of human and machine recognition of everyday human actions

  • Authors:
  • Trevor D. Jones, Shaun W. Lawson, David Benyon, Alistair Armitage

  • Affiliations:
  • Jones, Lawson: Lincoln Social Computing Research Centre, Department of Computing and Informatics, University of Lincoln, Brayford Pool, Lincoln, UK
  • Benyon, Armitage: School of Computing, Napier University, Edinburgh, UK

  • Venue:
  • ICDHM '07: Proceedings of the 1st International Conference on Digital Human Modeling
  • Year:
  • 2007

Abstract

The research presented here contributes to the understanding of biological-motion recognition by comparing human recognition of a set of everyday gestures and motions with machine interpretation of the same dataset. Our reasoning is that analysing the differences and correlations between the two could reveal insights into how humans themselves perceive motion and hint at the most important cues that artificial classifiers should use for such a task. We captured biological motion data from human participants engaged in a number of everyday activities, such as walking, running and waving, and then built two artificial classifiers (a finite state machine, FSM, and a multi-layer perceptron artificial neural network, ANN) capable of discriminating between these activities. We then compared the accuracy of these classifiers with the ability of a group of human observers to interpret the same activities when presented as moving light displays (MLDs). Our results suggest that machine recognition with ANNs is not only comparable to human levels of recognition but can exceed them in some instances.
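To make the finite-state-machine approach concrete, here is a minimal toy sketch of the idea: per-frame motion features drive transitions between activity states. The feature names (speed, arm height) and thresholds are illustrative assumptions, not the authors' actual design, which operated on captured marker data.

```python
def classify_activity(frames):
    """Classify a motion sequence with a toy state machine.

    `frames` is a list of (speed, arm_height) tuples -- a stand-in for
    features one might derive from moving-light-display marker data.
    Each frame may trigger a transition; otherwise the previous state
    persists, giving the classifier a simple form of hysteresis.
    """
    state = "idle"
    for speed, arm_height in frames:
        if arm_height > 0.8:      # hand raised above shoulder level
            state = "waving"
        elif speed > 2.0:         # fast forward translation
            state = "running"
        elif speed > 0.5:         # moderate forward translation
            state = "walking"
        # else: no transition, keep the current state
    return state

print(classify_activity([(0.7, 0.2), (0.8, 0.1)]))   # walking
print(classify_activity([(2.5, 0.2), (2.6, 0.3)]))   # running
print(classify_activity([(0.1, 0.9), (0.0, 0.95)]))  # waving
```

The ANN classifier in the paper replaces these hand-chosen thresholds with decision boundaries learned from the training data, which is what makes the head-to-head comparison with human observers informative.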