Design and validation of an image-guided robot for small animal research

  • Authors:
  • Peter Kazanzides (Department of Computer Science, Johns Hopkins University); Jenghwa Chang (Medical Physics Department, Memorial Sloan Kettering Cancer Center); Iulian Iordachita (Department of Computer Science, Johns Hopkins University); Jack Li (Department of Mechanical Engineering, Johns Hopkins University); C. Clifton Ling (Medical Physics Department, Memorial Sloan Kettering Cancer Center); Gabor Fichtinger (Department of Computer Science, Johns Hopkins University)

  • Venue:
  • MICCAI 2006: Proceedings of the 9th International Conference on Medical Image Computing and Computer-Assisted Intervention, Part I
  • Year:
  • 2006

Abstract

We developed an image-guided robot system to achieve highly accurate placement of thin needles and probes into in vivo rodent tumor tissue, in a predefined pattern specified on a preoperative image. This system can be used for many experimental procedures whose goal is to correlate a set of physical measurements with a corresponding set of image intensities or, more generally, to perform a physical action at a set of anatomic points identified on a preoperative image. This paper focuses on the design and validation of the robot system; the first application is to insert oxygen measurement probes in a three-dimensional (3D) grid pattern defined with respect to a PET scan of a tumor. The design is compatible with CT and MRI, which we plan to use to identify targets for biopsy and for the injection of adenoviral sequences for gene therapy. The validation is performed using a phantom and includes a new method for estimating the Fiducial Localization Error (FLE) based on the measured Fiducial Distance Error (FDE).
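The abstract does not give the FLE-from-FDE relation, but one plausible version of the idea can be sketched under an isotropic, zero-mean Gaussian model of localization error: if each fiducial is perturbed by noise with per-axis standard deviation sigma, then to first order the error in a measured inter-fiducial distance has variance 2·sigma², so FLE_rms = sqrt(3)·sigma = sqrt(3/2)·RMS(FDE). The simulation below (fiducial layout, noise level, and sample counts are all hypothetical, not from the paper) checks that relation numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.1                      # per-axis localization noise (mm), assumed
true_fle = np.sqrt(3) * sigma    # RMS FLE for isotropic 3D Gaussian noise

# Hypothetical fiducial layout (mm); not taken from the paper's phantom.
fiducials = rng.uniform(0.0, 50.0, size=(8, 3))

fde_sq = []
for _ in range(2000):
    # Simulate one localization of all fiducials with isotropic noise.
    measured = fiducials + rng.normal(0.0, sigma, size=fiducials.shape)
    for i in range(len(fiducials)):
        for j in range(i + 1, len(fiducials)):
            d_true = np.linalg.norm(fiducials[i] - fiducials[j])
            d_meas = np.linalg.norm(measured[i] - measured[j])
            fde_sq.append((d_meas - d_true) ** 2)

# Invert the first-order relation E[FDE^2] = (2/3) * FLE_rms^2.
rms_fde = np.sqrt(np.mean(fde_sq))
est_fle = np.sqrt(1.5) * rms_fde
print(f"true FLE_rms = {true_fle:.4f} mm, estimated = {est_fle:.4f} mm")
```

The estimate converges to the true RMS FLE as the number of distance samples grows; the linearization is accurate because the noise (0.1 mm) is small relative to the inter-fiducial distances (tens of mm).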