Transfer discriminant-analysis of canonical correlations for view-transfer action recognition

  • Authors:
  • Xinxiao Wu;Cuiwei Liu;Yunde Jia

  • Affiliations:
Beijing Laboratory of Intelligent Information Technology, School of Computer Science, Beijing Institute of Technology, Beijing, China

  • Venue:
PCM'12: Proceedings of the 13th Pacific-Rim Conference on Advances in Multimedia Information Processing
  • Year:
  • 2012

Abstract

A novel transfer learning approach, referred to as Transfer Discriminant-Analysis of Canonical Correlations (Transfer DCC), is proposed to recognize human actions in one view (the target view) via a discriminative model learned from another view (the source view). To cope with the considerable difference between the feature distributions of the source view and the target view, Transfer DCC incorporates an effective nonparametric criterion into the discriminative function to minimize the mismatch between the data distributions of the two views. We use the canonical correlation between the means of samples from the source view and the target view to measure the distance between the two data distributions. Consequently, Transfer DCC learns an optimal projection matrix by simultaneously maximizing the canonical correlation between the mean samples of the source and target views, maximizing the canonical correlations of within-class samples, and minimizing the canonical correlations of between-class samples. Moreover, we propose a Weighted Canonical Correlations scheme to fuse the multi-class canonical correlations from multiple source views according to their corresponding weights for recognition in the target view. Experiments on the IXMAS multi-view dataset demonstrate the effectiveness of our method.
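The abstract describes an objective that couples three canonical-correlation terms: within-class correlations (maximized), between-class correlations (minimized), and the correlation between source-view and target-view mean samples (maximized to reduce distribution mismatch). The Python sketch below is only a rough illustration of how such a score might be evaluated for a candidate projection matrix, assuming canonical correlations are computed as singular values between orthonormalized, projected sample sets. The function names, the trade-off weight `lam`, and the representation of view means as per-class columns are assumptions for illustration; they are not the authors' actual formulation, and the sketch only scores a projection rather than learning the optimal one as the paper does.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between the subspaces spanned by the columns
    of X and Y: the singular values of Px.T @ Py, where Px and Py are
    orthonormal bases obtained via thin QR decompositions."""
    Px, _ = np.linalg.qr(X)
    Py, _ = np.linalg.qr(Y)
    return np.linalg.svd(Px.T @ Py, compute_uv=False)

def transfer_dcc_score(T, within_pairs, between_pairs, mean_pair, lam=1.0):
    """Illustrative Transfer-DCC-style score for a projection matrix T
    (features x reduced_dim): reward within-class and source/target-mean
    canonical correlations, penalize between-class correlations."""
    def corr_sum(A, B):
        # Project both sample sets with T, then sum their canonical correlations.
        return canonical_correlations(T.T @ A, T.T @ B).sum()

    within = sum(corr_sum(A, B) for A, B in within_pairs)    # same-class set pairs
    between = sum(corr_sum(A, B) for A, B in between_pairs)  # different-class set pairs
    source_means, target_means = mean_pair                   # per-class mean samples as columns
    transfer = corr_sum(source_means, target_means)          # distribution-mismatch term
    return within - between + lam * transfer

# Toy usage with random data: 3 classes, 60-D features, 10 samples per set.
rng = np.random.default_rng(0)
sets = {c: [rng.standard_normal((60, 10)) for _ in range(2)] for c in range(3)}
within = [(sets[c][0], sets[c][1]) for c in range(3)]
between = [(sets[0][0], sets[1][0]), (sets[1][0], sets[2][0])]
means = (rng.standard_normal((60, 3)), rng.standard_normal((60, 3)))
T = rng.standard_normal((60, 20))
print(transfer_dcc_score(T, within, between, means))
```

In this sketch a larger score corresponds to a projection that better separates classes while aligning the two views; the paper instead derives the optimal projection matrix analytically from its discriminative criterion.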