Spatial Motion Constraints Using Virtual Fixtures Generated by Anatomy

  • Authors:
  • M. Li, M. Ishii, and R. H. Taylor

  • Affiliations:
  • National Institutes of Health, Bethesda, MD

  • Venue:
  • IEEE Transactions on Robotics
  • Year:
  • 2007

Abstract

This paper describes an approach for generating spatial motion constraints for a human-machine collaborative surgical-assistant system from registered computed tomography (CT) models. We extend a constrained-optimization formulation to incorporate task goals, anatomy-based constraints, "no-fly zones," etc. We use a fast potential-collision-constraint-detection method based on a 3-D surface model and a covariance-tree data structure. These boundary constraints, along with task behaviors and joint limits, serve as the constraint conditions for constrained robot control. We are able to follow a complex path inside a human skull phantom, represented by a surface model composed of 99,000 vertices and 182,000 triangles, in real time. Our approach enables real-time, task-based control of a surgical robot in precise, interactive minimally invasive surgery tasks. We illustrate the approach on two example tasks analogous to procedures in endoscopic sinus surgery, and analyze user performance under both teleoperation and cooperative control for one of the tasks. The experimental results show that a robotic assistant employing our approach to spatial motion constraints can assist the user in skilled manipulation tasks while maintaining desired properties. The approach is equally applicable to teleoperated and cooperatively controlled robots.
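
For intuition, the constrained-control step the abstract refers to can be pictured as a small per-cycle quadratic program: find the motion increment closest to the user's commanded increment, subject to linearized boundary constraints derived from the registered anatomy. The Python sketch below is an illustration under that assumption, not the authors' implementation: `constrained_step`, `toward_boundary`, `clearances`, and `margin` are hypothetical names, SciPy's SLSQP solver stands in for the paper's constrained-optimization machinery, and the covariance-tree proximity queries that would supply the boundary directions and clearances are omitted.

```python
# Minimal sketch of a per-cycle constrained-control step, assuming a
# quadratic objective with linearized boundary constraints. All names
# here are hypothetical; the paper's actual formulation and the
# covariance-tree queries that would supply `toward_boundary` and
# `clearances` are not reproduced.
import numpy as np
from scipy.optimize import minimize

def constrained_step(desired_step, toward_boundary, clearances, margin=0.5):
    """Find the motion increment closest to the commanded one that keeps
    the tool at least `margin` (same units as `clearances`) away from
    every nearby boundary patch.

    desired_step    : (3,) Cartesian increment commanded by the user
    toward_boundary : (k, 3) unit vectors from tool tip toward each patch
    clearances      : (k,) current distances to those patches
    """
    # Objective: deviate as little as possible from the commanded motion.
    objective = lambda dx: float(np.sum((dx - desired_step) ** 2))
    # One linear inequality per patch: motion toward the boundary may not
    # exceed the remaining clearance (SciPy's "ineq" means fun(x) >= 0).
    constraints = [
        {"type": "ineq", "fun": lambda dx, n=n, c=c: (c - margin) - n @ dx}
        for n, c in zip(toward_boundary, clearances)
    ]
    result = minimize(objective, x0=np.zeros(3),
                      constraints=constraints, method="SLSQP")
    return result.x

# Example: a 1 mm commanded step straight at a wall only 0.8 mm away is
# clipped to ~0.3 mm, stopping the tool at the 0.5 mm safety margin.
step = constrained_step(np.array([1.0, 0.0, 0.0]),
                        toward_boundary=np.array([[1.0, 0.0, 0.0]]),
                        clearances=np.array([0.8]))
print(step)  # approximately [0.3, 0.0, 0.0]
```

Because the constraints are regenerated every servo cycle from the current tool pose, motion along the boundary remains free while motion into the "no-fly zone" is clipped, which is the qualitative behavior the abstract describes for both teleoperated and cooperatively controlled modes.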