Automatic recognition of object size and shape via user-dependent measurements of the grasping hand

  • Authors:
  • Radu-Daniel Vatavu; Ionuţ-Alexandru Zaiţi

  • Affiliations:
  • University Stefan cel Mare of Suceava, str. Universitatii nr. 13, 720229 Suceava, Romania (both authors)

  • Venue:
  • International Journal of Human-Computer Studies
  • Year:
  • 2013


Abstract

An investigation is conducted on the feasibility of using the posture of the hand during prehension to identify geometric properties of grasped objects, such as size and shape. A recent study by Paulson et al. (2011) demonstrated the successful use of hand posture for discriminating between several actions in an office setting. Inspired by their approach and following closely the results on motor planning and control from psychology (MacKenzie and Iberall, 1994), we adopt a more cautious and meticulous approach in order to understand the opportunities that hand posture brings for recognizing properties of target objects. We present results from an experiment designed to investigate recognition of object properties during grasping in two different conditions: object translation (involving firm grasps) and object exploration (which includes a large variety of hand and finger configurations). We show that object size and shape can be recognized with up to 98% accuracy during translation, and with up to 95% (size) and 91% (shape) accuracy during exploration, by employing user-dependent training. In contrast, our experiments show lower accuracy (up to 60%) under user-independent training for all tested classification techniques. We also point out the variability of individual grasping postures resulting from object exploration and the consequent need for classifiers trained on a large set of examples. The results of this work can benefit psychologists and researchers interested in human studies and motor control by providing further insight into grasping measurements, pattern recognition practitioners by reporting recognition results for new algorithms, and designers of gesture-based interactive systems by providing them with design guidelines derived from our experiment.
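The gap the abstract reports between user-dependent and user-independent training can be illustrated with a toy sketch. The code below is not the paper's method: the features (a 5-value "finger flexion" vector), the per-user offset standing in for individual grasping style, the nearest-centroid classifier, and all numeric constants are assumptions made purely for illustration. It contrasts training and testing within one user against training on other users and testing on a held-out user.

```python
import random
from statistics import mean

random.seed(0)

# Hypothetical synthetic data: each grasp sample is a 5-value "finger
# flexion" posture vector. Flexion depends on object size; a per-user
# offset stands in for individual grasping style.
SIZES = ["small", "medium", "large"]
SIZE_LEVEL = {"small": 0.8, "medium": 0.5, "large": 0.2}

def sample_posture(size, user_offset):
    return [SIZE_LEVEL[size] + user_offset + random.gauss(0, 0.05)
            for _ in range(5)]

def make_user(user_offset, n_per_size=20):
    # Labeled grasps (posture_vector, size_label) for one user.
    return [(sample_posture(s, user_offset), s)
            for s in SIZES for _ in range(n_per_size)]

def centroids(train):
    # One mean posture vector per size class.
    cents = {}
    for s in SIZES:
        vecs = [x for x, lbl in train if lbl == s]
        cents[s] = [mean(col) for col in zip(*vecs)]
    return cents

def classify(x, cents):
    # Nearest centroid by squared Euclidean distance.
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(cents, key=lambda s: dist(x, cents[s]))

def accuracy(train, test):
    cents = centroids(train)
    hits = sum(classify(x, cents) == lbl for x, lbl in test)
    return hits / len(test)

users = [make_user(off) for off in (0.0, 0.15, -0.15, 0.3)]

# User-dependent: train and test on the same user's grasps.
dep = mean(accuracy(u[::2], u[1::2]) for u in users)

# User-independent: train on all other users, test on the held-out one.
indep = mean(
    accuracy([s for j, u in enumerate(users) if j != i for s in u],
             users[i])
    for i in range(len(users)))

print(f"user-dependent accuracy:   {dep:.2f}")
print(f"user-independent accuracy: {indep:.2f}")
```

Because a held-out user's personal offset shifts every posture relative to the centroids learned from other users, user-independent accuracy degrades, mirroring (in spirit only) the drop from ~98% to ~60% reported in the abstract.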