Using Interaction Signatures to Find and Label Chairs and Floors

  • Authors: Patrick Peursum, Svetha Venkatesh, Geoff A. W. West, Hung Hai Bui

  • Affiliations: Curtin University of Technology (Peursum, Venkatesh, West); SRI International (Bui)

  • Venue: IEEE Pervasive Computing
  • Year: 2004

Abstract

The human-object interaction signatures approach to object recognition proposes to find and classify objects in a scene by referring solely to related human actions. This method specifically addresses the problems and opportunities encountered in the typical smart-home monitoring system: wide-angle views of cluttered scenes with frequent, repeated human activity. Traditional shape-based object recognition tends to fail under these conditions owing to the unconstrained variety of object shapes, target objects' low resolution, and the partial occlusion of target objects by other scene objects. In this new approach, the system labels objects using evidence accumulated over time and over multiple instances of human-object interactions. Furthermore, it uses partial occlusions of the person by an object to refine the object label's position. Preliminary experiments with this approach have investigated the interaction signatures associated with walking and with sitting on a chair, and then used the detected signatures to label a scene's chairs and navigable floor space.
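
The idea of accumulating per-location evidence from repeated interactions can be illustrated with a small sketch. The grid resolution, vote threshold, class names, and the plain vote-counting scheme below are illustrative assumptions, not the system described in the paper, which infers labels from recognised actions and occlusion cues in video:

```python
from collections import defaultdict

# Illustrative evidence map: each cell of a coarse floor-plane grid accumulates
# votes from detected human actions ("sit" suggests a chair, "walk" suggests
# navigable floor). CELL_SIZE, the vote threshold, and the simple counting
# scheme are assumptions for this sketch, not the paper's actual model.

CELL_SIZE = 0.5  # metres per grid cell (assumed)


class InteractionEvidenceMap:
    def __init__(self, vote_threshold=3):
        # cell -> candidate label -> accumulated vote count
        self.votes = defaultdict(lambda: defaultdict(int))
        self.vote_threshold = vote_threshold

    def _cell(self, x, y):
        return (int(x // CELL_SIZE), int(y // CELL_SIZE))

    def observe(self, action, x, y):
        """Record one detected human action at floor position (x, y)."""
        label = {"sit": "chair", "walk": "floor"}.get(action)
        if label is not None:
            self.votes[self._cell(x, y)][label] += 1

    def labels(self):
        """Return a label for every cell whose evidence passes the threshold."""
        labelled = {}
        for cell, counts in self.votes.items():
            label, count = max(counts.items(), key=lambda kv: kv[1])
            if count >= self.vote_threshold:
                labelled[cell] = label
        return labelled


# Usage: three walking passes along a corridor plus three sitting events at one spot.
emap = InteractionEvidenceMap(vote_threshold=3)
for _ in range(3):
    for x in (0.2, 0.7, 1.3, 1.8):
        emap.observe("walk", x, 0.2)
    emap.observe("sit", 2.1, 0.2)

print(emap.labels())
# e.g. {(0, 0): 'floor', (1, 0): 'floor', (2, 0): 'floor', (3, 0): 'floor', (4, 0): 'chair'}
```

Requiring several observations before a cell is labelled mirrors the paper's reliance on evidence gathered over time, so a single misdetected action does not by itself produce a chair or floor label.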