Complementing visual tracking of moving targets by fusion of tactile sensing
Robotics and Autonomous Systems
Recently, there has been a growing need for haptic exploration to estimate and extract physical object properties such as mass, friction, elasticity, and function. In this paper, we propose a novel approach to active modeling of articulated objects with Haptic Vision. The method automatically extracts and describes both the geometrical and physical properties of an articulated object by observing its interactions through active vision and "active touch" performed by a robot hand, using a CCD camera, a range sensor, and force-feedback sensors. Such models provide users with reality-based interaction with the objects in virtual environments, allowing physical properties such as functions, part motions, and linking structures to be tested and extracted. Experimental results on a paper punch and a pair of pliers are presented, and these results were successfully used to construct a reality-based virtual-environment simulator.
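As a minimal illustration of the kind of physical-property extraction the abstract describes, the sketch below fits a linear elasticity (spring) constant from paired force and displacement samples, such as those a force-feedback sensor might record while a robot finger presses on an object part. The sensor model, data, and function names here are assumptions for illustration, not details from the paper.

```python
import numpy as np

def estimate_stiffness(displacements, forces):
    """Least-squares fit of k in F = k * x from paired probe samples.

    Assumes a linear-elastic contact model; in practice the haptic
    system would gather (x, F) pairs during "active touch".
    """
    x = np.asarray(displacements, dtype=float)
    f = np.asarray(forces, dtype=float)
    # Closed-form least-squares solution for a single parameter k.
    return float(x @ f / (x @ x))

# Simulated probe of a part with true stiffness 250 N/m plus sensor noise.
rng = np.random.default_rng(0)
x = np.linspace(0.001, 0.01, 20)              # probe depths [m]
f = 250.0 * x + rng.normal(0.0, 0.05, x.size)  # measured forces [N]
k_hat = estimate_stiffness(x, f)               # estimate close to 250 N/m
```

The same least-squares pattern extends to other scalar properties (e.g. a friction coefficient from tangential/normal force pairs) once the corresponding contact model is chosen.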