Sparse representation for recognizing object-to-object actions under occlusions

  • Authors:
  • Jun-We Hsieh;Kai-Ting Chuang;Yilin Yan;Li-Chih Chen

  • Affiliations:
  • National Taiwan Ocean University, Keelung, Taiwan;National Taiwan Ocean University, Keelung, Taiwan;National Taiwan Ocean University, Keelung, Taiwan;Yuan Ze University, Chung-Li, Taiwan

  • Venue:
  • Proceedings of the Fifth International Conference on Internet Multimedia Computing and Service
  • Year:
  • 2013

Abstract

This paper proposes a novel event classification scheme that uses sparse representation to analyze interaction actions between persons. Two major challenges in person-to-person action analysis are occlusion and the high complexity of modeling complicated interactions. To address the occlusion problem, the proposed scheme represents an action sample over an over-complete dictionary whose base elements are the training samples themselves. This representation is naturally sparse, and errors caused by environmental changes such as lighting variations or occlusions appear only sparsely with respect to the training library; because of this sparsity, the representation is robust to occlusions and lighting changes. In addition, a novel Hamming distance classification (HDC) scheme is proposed to classify action events into detailed types. Because Hamming codes are highly tolerant to noise, the HDC scheme is also robust to occlusions. The high complexity of modeling complicated actions is handled by adding more examples to the over-complete dictionary, so even complicated interaction relations can be recognized, and the method extends easily to action events among multiple persons. More importantly, the HDC scheme is efficient and suitable for real-time applications because no optimization process is needed to compute the reconstruction error.
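
To make the sparse-representation idea concrete, the sketch below shows a generic sparse-representation-based classifier in the style the abstract describes: the dictionary columns are the raw training samples, a query is sparse-coded over that over-complete dictionary, and the class whose samples best reconstruct the query wins. This is an illustrative assumption based only on the abstract; it uses an optimization-based sparse coder (orthogonal matching pursuit) and does not reproduce the paper's optimization-free HDC scheme. The function name src_classify and its parameters are hypothetical.

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp


def src_classify(D, labels, y, n_nonzero=10):
    """Sparse-representation classification sketch (not the paper's HDC).

    D       : (d, n) dictionary whose columns are l2-normalized training samples
    labels  : length-n array of class ids, one per column of D
    y       : (d,) query action sample
    """
    # Sparse code of the query over the over-complete dictionary of training samples.
    x = orthogonal_mp(D, y, n_nonzero_coefs=n_nonzero)

    # Per-class reconstruction error: keep only the coefficients of one class,
    # reconstruct the query from that class's samples, and measure the residual.
    errors = {}
    for c in np.unique(labels):
        x_c = np.where(labels == c, x, 0.0)
        errors[c] = np.linalg.norm(y - D @ x_c)

    # The class whose training samples best reconstruct the query is chosen.
    return min(errors, key=errors.get)
```

Because occlusion or lighting corruption affects only part of the query, its sparse code still concentrates on training samples of the correct class, which is the robustness property the abstract attributes to the sparse representation.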