Emulating human perception of motion similarity

  • Authors:
  • Jeff K. T. Tang, Howard Leung, Taku Komura, Hubert P. H. Shum

  • Affiliations:
  • (Corresponding author) Department of Computer Science, City University of Hong Kong, 83 Tat Chee Ave., Kowloon, Hong Kong

  • Venue:
  • Computer Animation and Virtual Worlds - CASA'2008 Special Issue
  • Year:
  • 2008

Abstract

Evaluating the similarity of motions is useful for motion retrieval, motion blending, and performance analysis of dancers and athletes. The Euclidean distance between corresponding joints has been widely adopted for measuring the similarity of postures, and hence of motions. However, such a measure does not necessarily conform to the human perception of motion similarity. In this paper, we propose a new similarity measure based on machine learning techniques. We make use of the results of questionnaires in which subjects answered whether arbitrary pairs of motions appear similar or not. Using the relative distances between joints as the basic features, we train the system to compute the similarity of an arbitrary pair of motions. Experimental results show that our method outperforms methods based on the Euclidean distance between corresponding joints. Our method is applicable to content-based retrieval of human motion in large-scale database systems. It is also applicable to e-Learning systems that automatically evaluate the performance of dancers and athletes by comparing the subjects' motions with those of experts. Copyright © 2008 John Wiley & Sons, Ltd.
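
The general idea described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the authors' actual implementation: the pairwise relative-distance features, the per-frame averaging, and the logistic-regression model are all assumptions made for the sketch, and the training data below is synthetic placeholder data standing in for the questionnaire labels.

```python
# Hypothetical sketch: learning a motion-similarity measure from human
# similarity labels, using relative distances between joints as features.
import numpy as np
from sklearn.linear_model import LogisticRegression

def relative_distance_features(pose):
    """Pairwise Euclidean distances between all joints of one pose.

    pose: (num_joints, 3) array of joint positions.
    Returns a 1-D vector of length num_joints * (num_joints - 1) / 2.
    """
    diffs = pose[:, None, :] - pose[None, :, :]   # (J, J, 3) displacement vectors
    dists = np.linalg.norm(diffs, axis=-1)        # (J, J) distance matrix
    iu = np.triu_indices(pose.shape[0], k=1)      # upper triangle, no diagonal
    return dists[iu]

def motion_pair_features(motion_a, motion_b):
    """Per-frame absolute difference of relative-distance features, averaged
    over time. Motions: (num_frames, num_joints, 3) arrays of equal length
    (e.g. after time alignment, which is omitted in this sketch)."""
    fa = np.stack([relative_distance_features(p) for p in motion_a])
    fb = np.stack([relative_distance_features(p) for p in motion_b])
    return np.abs(fa - fb).mean(axis=0)

# Synthetic stand-in for the questionnaire data: pairs of motions plus
# binary human judgments ("similar" / "not similar").
rng = np.random.default_rng(0)
pairs = [(rng.normal(size=(60, 15, 3)), rng.normal(size=(60, 15, 3)))
         for _ in range(40)]
labels = rng.integers(0, 2, size=40)

X = np.stack([motion_pair_features(a, b) for a, b in pairs])
clf = LogisticRegression(max_iter=1000).fit(X, labels)

def perceptual_similarity(motion_a, motion_b):
    """Estimated probability that a human would judge the motions as similar."""
    feats = motion_pair_features(motion_a, motion_b)[None, :]
    return clf.predict_proba(feats)[0, 1]
```

In such a setup, retrieval or performance evaluation would rank candidate motions by the learned probability rather than by raw joint-to-joint Euclidean distance, which is the contrast the abstract draws.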