Exploiting eye-hand coordination to detect grasping movements

  • Authors:
  • Miguel Carrasco; Xavier Clady

  • Affiliations:
  • Escuela de Informática y Telecomunicaciones, Facultad de Ingeniería, Universidad Diego Portales, Vergara 432, Santiago, Chile; Vision Institute, University Pierre and Marie Curie-UPMC, INSERM UMR S968, CNRS UMR 7222, Paris, France

  • Venue:
  • Image and Vision Computing
  • Year:
  • 2012

Abstract

Human beings are highly skilled at reaching for and grasping objects under many conditions, coping with wide variation in an object's position, location, structure and orientation. This natural ability, controlled by the brain, is called eye-hand coordination. Understanding this behavior requires studying eye and hand movements simultaneously. This paper proposes a novel computer-vision approach to detecting grasping movements. The solution fuses two viewpoints: one obtained from an eye-tracker capturing the user's perspective, and a second captured by a wearable camera attached to the user's wrist. Combining information from these two viewpoints makes it possible to characterize multiple hand movements, together with eye-gaze movements, within a Hidden Markov Model framework. The paper shows that fusing these two sources allows hand gestures to be detected using only the objects in the scene, without markers on the objects' surfaces. Moreover, the desired object can be identified before the user actually grasps it.
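To make the Hidden Markov Model framework concrete, the sketch below shows how a quantized observation sequence (e.g. fused eye-gaze/hand-motion features) can be classified by scoring it under one discrete HMM per gesture class with the forward algorithm. This is a minimal illustration, not the paper's actual models: the two-state "grasp" and "reach" models, their parameters, and the three observation symbols are all hypothetical.

```python
# Illustrative sketch: classify a fused eye/hand observation sequence by
# maximum likelihood over per-gesture discrete HMMs. All model parameters
# below are made up for demonstration; they are NOT from the paper.
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm in log space for stability.
    pi: initial state probs (N,), A: transitions (N,N), B: emissions (N,M)."""
    alpha = np.log(pi) + np.log(B[:, obs[0]])
    for o in obs[1:]:
        # log-sum-exp over previous states for each current state
        alpha = np.logaddexp.reduce(alpha[:, None] + np.log(A), axis=0) + np.log(B[:, o])
    return np.logaddexp.reduce(alpha)

# Two toy 2-state models over 3 observation symbols (hypothetical values).
grasp = dict(pi=np.array([0.9, 0.1]),
             A=np.array([[0.7, 0.3], [0.2, 0.8]]),
             B=np.array([[0.8, 0.1, 0.1], [0.1, 0.1, 0.8]]))
reach = dict(pi=np.array([0.5, 0.5]),
             A=np.array([[0.5, 0.5], [0.5, 0.5]]),
             B=np.array([[0.1, 0.8, 0.1], [0.3, 0.4, 0.3]]))

def classify(obs):
    """Return the gesture label whose HMM best explains the sequence."""
    scores = {name: forward_log_likelihood(obs, **m)
              for name, m in [("grasp", grasp), ("reach", reach)]}
    return max(scores, key=scores.get)

print(classify([0, 0, 2, 2]))  # prints "grasp"
```

In practice one trained model per hand/eye gesture would be fitted (e.g. with Baum-Welch) and the same maximum-likelihood decision rule applied to incoming feature sequences.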