Exploring dense trajectory feature and encoding methods for human interaction recognition

  • Authors:
  • Xiaojiang Peng; Xiao Wu; Qiang Peng; Xianbiao Qi; Yu Qiao; Yanhua Liu

  • Affiliations:
  • Southwest Jiaotong University, Chengdu, China and Shenzhen Key Lab of CVPR, Shenzhen Institute of Advanced Technology, CAS
  • Southwest Jiaotong University, Chengdu, China
  • Southwest Jiaotong University, Chengdu, China
  • Beijing University of Posts and Telecommunications, Beijing, China
  • Shenzhen Key Lab of CVPR, Shenzhen Institute of Advanced Technology, CAS
  • Samsung Guangzhou Mobile R&D Center, Guangzhou, China

  • Venue:
  • Proceedings of the Fifth International Conference on Internet Multimedia Computing and Service
  • Year:
  • 2013

Abstract

Recently, human activity recognition has attracted increasing attention due to its wide range of potential applications. Much progress has been made on recognizing single actions in videos, but comparatively little on collective and interactive activities. Human interaction recognition is more challenging because multiple actors take part in each execution. In this paper, we utilize multi-scale dense trajectories and explore four advanced feature encoding methods for human interaction recognition within a bag-of-features framework. Specifically, dense trajectories are described by trajectory shape, histograms of oriented gradients, histograms of optical flow, and motion boundary histograms, all computed efficiently with integral images. Experimental results on the UT-Interaction dataset show that our approach outperforms state-of-the-art methods by 7-14%. Additionally, we thoroughly analyse the finding that, with dense trajectories in videos, vector quantization performs on par with or even better than more sophisticated feature encoding methods.
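To make the bag-of-features pipeline mentioned in the abstract concrete, below is a minimal sketch of the vector quantization (hard-assignment) encoding step. The function name `vq_encode`, the toy 2-D descriptors, and the 3-word codebook are illustrative assumptions, not the paper's actual setup; in the paper's pipeline the descriptors would be trajectory-aligned HOG/HOF/MBH features and the codebook would come from k-means clustering over a training sample.

```python
import numpy as np

def vq_encode(descriptors, codebook):
    """Hard-assignment vector quantization for bag-of-features encoding.

    descriptors: (N, D) array of local features (e.g. HOG/HOF/MBH
                 descriptors extracted along dense trajectories).
    codebook:    (K, D) array of visual words (e.g. k-means centroids).
    Returns an L1-normalized K-bin histogram of visual-word counts.
    """
    # Squared Euclidean distance from every descriptor to every codeword.
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    words = d2.argmin(axis=1)  # index of the nearest codeword per descriptor
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / max(hist.sum(), 1.0)  # L1 normalization

# Toy example with made-up values: 100 two-dimensional descriptors
# sampled around a 3-word codebook.
rng = np.random.default_rng(0)
codebook = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]])
desc = rng.normal(loc=codebook[rng.integers(0, 3, size=100)], scale=0.5)
h = vq_encode(desc, codebook)
print(h.shape)  # one histogram bin per visual word
```

The resulting fixed-length histogram is what a video-level classifier (typically an SVM in bag-of-features pipelines) consumes; the "sophisticated" alternatives the abstract compares against replace this hard-assignment step with soft or higher-order encodings.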