Activity recognition using an egocentric perspective of everyday objects

  • Authors:
  • Dipak Surie; Thomas Pederson; Fabien Lagriffoul; Lars-Erik Janlert; Daniel Sjölie

  • Affiliations:
  • Department of Computing Science, Umeå University, Umeå, Sweden (Surie, Pederson, Lagriffoul, Janlert); VRlab / HPC2N, Umeå University, Umeå, Sweden (Sjölie)

  • Venue:
  • UIC'07 Proceedings of the 4th international conference on Ubiquitous Intelligence and Computing
  • Year:
  • 2007

Abstract

This paper presents an activity recognition approach based on tracking a specific human actor's current object manipulation actions, complemented by two kinds of situational information: (1) the set of objects that are visually observable (inside the "observable space") and (2) the set of objects that are technically graspable (inside the "manipulable space"). This "egocentric" model is inspired by situated action theory and offers the advantage of not depending on technology for absolute positioning of either the human or the objects. Applied in an immersive Virtual Reality environment, the proposed activity recognition approach achieves a recognition precision of 89% at the activity level and 76% at the action level across 10 everyday home activities.
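The core idea in the abstract can be illustrated with a toy sketch: candidate activities are scored by how well the currently manipulated, manipulable, and merely observable objects match a per-activity object profile. This is a hypothetical illustration only, not the paper's implementation; the activity names, object sets, and weights below are invented for demonstration.

```python
# Hypothetical sketch of egocentric activity recognition (NOT the paper's
# method): rank candidate activities by weighted overlap between their
# object profiles and the objects in the actor's current situation.
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class Situation:
    observable: Set[str]        # objects inside the "observable space"
    manipulable: Set[str]       # objects inside the "manipulable space"
    manipulated: Optional[str]  # object currently being manipulated, if any

# Illustrative activity profiles; the object sets are invented examples.
PROFILES = {
    "making coffee": {"kettle", "mug", "coffee jar", "spoon"},
    "brushing teeth": {"toothbrush", "toothpaste", "tap"},
}

def score(activity_objects: Set[str], s: Situation) -> float:
    """Weighted overlap: the manipulated object counts most, then
    manipulable objects, then merely observable ones (weights are
    arbitrary choices for this sketch)."""
    total = 0.0
    for obj in activity_objects:
        if obj == s.manipulated:
            total += 3.0
        elif obj in s.manipulable:
            total += 2.0
        elif obj in s.observable:
            total += 1.0
    return total / len(activity_objects)

def recognize(s: Situation) -> str:
    """Return the highest-scoring candidate activity."""
    return max(PROFILES, key=lambda a: score(PROFILES[a], s))
```

For example, a situation where the actor manipulates a kettle with a mug within reach would be classified as "making coffee", since the manipulated and manipulable objects dominate the score.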