Learning a 3D Human Pose Distance Metric from Geometric Pose Descriptor

  • Authors:
  • Cheng Chen; Yueting Zhuang; Feiping Nie; Yi Yang; Fei Wu; Jun Xiao

  • Affiliations:
  • Idiap Research Institute, Martigny; Zhejiang University, Hangzhou; University of Texas at Arlington, Arlington; ITEE, The University of Queensland, Brisbane; Zhejiang University, Hangzhou; Zhejiang University, Hangzhou

  • Venue:
  • IEEE Transactions on Visualization and Computer Graphics
  • Year:
  • 2011

Abstract

Estimating 3D pose similarity is a fundamental problem in processing 3D motion data. Most previous work computes an L2-like distance over joint orientations or coordinates, which does not sufficiently reflect pose similarity as perceived by humans. In this paper, we present a new pose distance metric. First, we propose a rich pose feature set called the Geometric Pose Descriptor (GPD). GPD encodes pose similarity more effectively by exploiting geometric relations among body parts as well as temporal information such as velocities and accelerations. Based on GPD, we propose a semi-supervised distance metric learning algorithm called Regularized Distance Metric Learning with Sparse Representation (RDSR), which integrates information from both unsupervised data relationships and labels. We apply the proposed pose distance metric to motion transition decision and content-based pose retrieval. Quantitative evaluations demonstrate that our method achieves better results with only a small number of human labels, showing that the proposed pose distance metric is a promising building block for various 3D motion-related applications.
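
The abstract describes two ingredients: a pose descriptor built from geometric relations among body parts plus velocities and accelerations, and a learned distance metric applied to that descriptor. The sketch below is only an illustrative approximation of these ideas, not the paper's actual GPD features or RDSR algorithm; the feature choices (pairwise joint distances, finite-difference derivatives), function names, and the plain Mahalanobis form of the learned distance are all assumptions made for demonstration.

```python
import numpy as np

def gpd_like_features(frames, dt=1.0 / 30.0):
    """Illustrative pose descriptor (not the paper's exact GPD):
    pairwise joint distances per frame (geometric relations),
    plus finite-difference velocities and accelerations (temporal cues).
    `frames` has shape (T, J, 3): T frames of J joints in 3D."""
    T, J, _ = frames.shape
    # Geometric relations: Euclidean distances between all joint pairs.
    iu, ju = np.triu_indices(J, k=1)
    pairwise = np.linalg.norm(frames[:, iu] - frames[:, ju], axis=-1)
    # Temporal information: per-joint velocities and accelerations.
    vel = np.gradient(frames, dt, axis=0).reshape(T, -1)
    acc = np.gradient(vel, dt, axis=0)
    return np.concatenate([pairwise, vel, acc], axis=1)

def learned_pose_distance(x, y, M):
    """Mahalanobis-style distance d_M(x, y) = sqrt((x - y)^T M (x - y)),
    where M is a positive semidefinite matrix produced by a metric
    learning step (here left abstract; RDSR would supply it)."""
    d = x - y
    return float(np.sqrt(max(d @ M @ d, 0.0)))

# Usage: compare two poses from a toy motion clip.
rng = np.random.default_rng(0)
clip = rng.normal(size=(10, 20, 3))   # 10 frames, 20 joints
feats = gpd_like_features(clip)
M = np.eye(feats.shape[1])            # identity M = plain Euclidean baseline
print(learned_pose_distance(feats[0], feats[1], M))
```

With an identity matrix this reduces to Euclidean distance on the descriptor; the point of a semi-supervised metric learner such as RDSR is to replace that identity with a matrix estimated from unlabeled data relationships plus a small number of labeled pose pairs.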