Active Modeling of Articulated Objects with Haptic Vision

  • Authors:
  • Masatsugu Uejo; Hiromi T. Tanaka

  • Affiliations:
  • Ritsumeikan University, Japan; Ritsumeikan University, Japan

  • Venue:
  • ICPR '04: Proceedings of the 17th International Conference on Pattern Recognition (ICPR'04), Volume 1
  • Year:
  • 2004


Abstract

Recently, there has been a growing need for haptic exploration to estimate and extract physical object properties such as mass, friction, elasticity, and function. In this paper, we propose a novel approach to active modeling of articulated objects with Haptic Vision. The method automatically extracts and describes both the geometrical and physical properties of an articulated object by observing its interactions with active vision and "active touch" by a robot hand, using a CCD camera, range sensors, and force-feedback sensors. Such models can provide users with reality-based interaction with the objects in virtual environments, allowing physical properties such as functions, part motions, and linking structures to be tested and extracted. Experimental results on a paper punch and a pair of pliers are presented, and these results were successfully used to construct a reality-based virtual environment simulator.