Object interaction detection using hand posture cues in an office setting

  • Authors:
  • Brandon Paulson; Danielle Cummings; Tracy Hammond

  • Affiliation:
  • TAMU Sketch Recognition Lab, 3112 TAMU, College Station, TX 77843, USA

  • Venue:
  • International Journal of Human-Computer Studies
  • Year:
  • 2011

Abstract

Activity recognition plays a key role in providing information for context-aware applications. When attempting to model activities, some researchers have looked towards Activity Theory, which theorizes that activities have objectives and are accomplished through interactions with tools and objects. The goal of this paper is to determine whether hand posture can be used as a cue to identify the types of interactions a user has with objects in a desk/office environment. Furthermore, we wish to determine whether hand posture is consistent across users when they interact with the same objects in a natural manner. Our experiments indicate that (a) hand posture can be used to determine object interaction, with accuracy rates around 97%, and (b) hand posture is dependent upon the individual user when users are allowed to interact with objects as they would naturally.
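The abstract describes classifying object interactions from hand-posture data. As a rough illustration only (the paper's actual features, sensors, and classifier are not given here), the idea can be sketched as a nearest-neighbor lookup over posture feature vectors; all values, labels, and the choice of 1-NN below are hypothetical:

```python
import math

# Hypothetical hand-posture feature vectors (e.g., normalized finger-flex
# readings from a data glove). The values and object labels are illustrative
# assumptions, not data from the paper.
TRAIN = [
    ((0.9, 0.8, 0.7, 0.9, 0.2), "mug"),       # wrap grasp
    ((0.1, 0.9, 0.8, 0.1, 0.1), "pen"),       # precision grip
    ((0.2, 0.2, 0.2, 0.2, 0.1), "keyboard"),  # flat hand
]

def classify(posture, train=TRAIN):
    """Label a posture by its nearest training example (1-NN, Euclidean)."""
    return min(train, key=lambda ex: math.dist(posture, ex[0]))[1]

# A posture close to the wrap-grasp example is labeled as a mug interaction.
print(classify((0.85, 0.75, 0.7, 0.9, 0.3)))  # -> mug
```

The paper's finding (b) suggests that, in practice, such a classifier would need per-user training data rather than a single user-independent model.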