Automatic Identification of Environment Haptic Properties

  • Authors:
  • Pierre E. Dupont; Capt. Timothy M. Schulteis; Paul A. Millman; Robert D. Howe

  • Affiliations:
  • Aerospace and Mechanical Engineering, Boston University, Boston, MA 02215, pierre@bu.edu; San Antonio Air Logistics Center, Kelly AFB, TX 78241; Computer Motion, Inc., Goleta, CA 93117; Division of Applied Science, Harvard University, Cambridge, MA 02138

  • Venue:
  • Presence: Teleoperators and Virtual Environments
  • Year:
  • 1999

Abstract

Many applications can be imagined for a system that processes sensory information collected during telemanipulation tasks in order to automatically identify properties of the remote environment. These applications include generating model-based simulations for training operators in critical procedures and improving real-time performance in unstructured environments or when time delays are large. This paper explores the research issues involved in developing such an identification system, focusing on properties that can be identified from remote manipulator motion and force data. As a case study, a simple block-stacking task, performed with a teleoperated two-fingered planar hand, is considered. An algorithm is presented that automatically segments the data collected during the task, given only a general description of the temporal sequence of task events. Using the segmented data, the algorithm then successfully estimates the weight, width, height, and coefficient of friction of the two blocks handled during the task. This data is used to calibrate a virtual model incorporating visual and haptic feedback. This case study highlights the broader research issues that must be addressed in automatic property identification.
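The abstract does not give implementation details, but a minimal sketch of the kind of property estimation it describes might look as follows. All function names, array layouts, and the synthetic data below are assumptions for illustration, not the authors' algorithm: weight is read from the net vertical finger force during a static hold segment, width from the fingertip gap while grasping, and the friction coefficient from the tangential-to-normal contact force ratio near slip.

```python
"""Illustrative sketch only: estimating block properties from segmented
force/position data of a two-fingered planar hand. Segment boundaries are
assumed to be given; names and array layouts are hypothetical."""
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def estimate_weight(vertical_finger_forces):
    # During a static hold (block lifted, not accelerating), the block's
    # weight equals the net upward force applied by the two fingers.
    return float(np.mean(np.sum(vertical_finger_forces, axis=1)))

def estimate_width(left_finger_x, right_finger_x):
    # While the block is grasped, its width is the gap between fingertips.
    return float(np.mean(right_finger_x - left_finger_x))

def estimate_friction(tangential_forces, normal_forces):
    # Under a Coulomb model, the tangential/normal force ratio observed
    # without slip lower-bounds mu; near slip onset it approximates mu.
    return float(np.max(np.abs(tangential_forces) /
                        np.clip(normal_forces, 1e-6, None)))

# Synthetic hold-segment data for a 0.2 kg, 40 mm wide block.
n = 200
fz = np.full((n, 2), 0.2 * G / 2)          # vertical force per finger (N)
lx, rx = np.zeros(n), np.full(n, 0.04)     # fingertip x-positions (m)
ft, fn = np.full(n, 1.2), np.full(n, 3.0)  # tangential, normal forces (N)

print("weight  (N):", estimate_weight(fz))
print("width   (m):", estimate_width(lx, rx))
print("mu (lower bound):", estimate_friction(ft, fn))
```

In the paper's setting, the segment boundaries themselves (grasp, lift, place, release) are found automatically from the force and motion records given only the general temporal sequence of task events; the sketch above assumes that segmentation has already been done.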