Visual tracking of surgical tools for proximity detection in retinal surgery

  • Authors:
  • Rogério Richa, Marcin Balicki, Eric Meisner, Raphael Sznitman, Russell Taylor, Gregory Hager

  • Affiliations:
  • Laboratory for Computational Science and Robotics, Johns Hopkins University, Baltimore, MD (all authors)

  • Venue:
  • IPCAI'11: Proceedings of the Second International Conference on Information Processing in Computer-Assisted Interventions
  • Year:
  • 2011


Abstract

In retinal surgery, surgeons face difficulties such as indirect visualization of surgical targets, physiological hand tremor, and a lack of tactile feedback. These difficulties increase the risk of incorrect surgical gestures that may damage the retina. In this context, robotic assistance has the potential to overcome current technical limitations and increase surgical safety. In this paper we present a method for robustly tracking surgical tools in retinal surgery in order to detect proximity between the tools and the retinal surface. An image similarity function based on weighted mutual information is specially tailored for tracking under severe illumination variations, lens distortions, and rapid motion. The proposed method was tested under challenging conditions on a phantom eye and on recorded in vivo human data acquired with an ophthalmic stereo microscope.
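The abstract does not specify how the weighting enters the mutual-information computation; the paper itself gives the exact formulation. As a rough illustration only, the sketch below computes a mutual-information score between two grayscale patches where each pixel pair contributes a per-pixel weight to the joint intensity histogram, so that reliable pixels (e.g. those on the tool) can dominate the score. The function name, the histogram-weighting scheme, and the bin count are assumptions, not the authors' method.

```python
import numpy as np

def weighted_mutual_information(patch_a, patch_b, weights=None, bins=32):
    """Illustrative weighted MI between two grayscale patches.

    Hypothetical sketch: each pixel pair contributes `weights` mass
    to the joint intensity histogram before MI is computed, which is
    one plausible way to emphasize reliable pixels. MI itself is
    robust to illumination changes because it measures statistical
    dependence between intensities rather than intensity equality.
    """
    a = np.ravel(patch_a).astype(float)
    b = np.ravel(patch_b).astype(float)
    w = np.ones_like(a) if weights is None else np.ravel(weights).astype(float)

    # Weighted joint histogram of quantized intensities.
    joint, _, _ = np.histogram2d(a, b, bins=bins, weights=w)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)  # marginal of patch_a
    py = pxy.sum(axis=0)  # marginal of patch_b

    # MI = sum over nonzero bins of p(x,y) * log(p(x,y) / (p(x) p(y))).
    nz = pxy > 0
    denom = np.outer(px, py)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / denom[nz])))
```

In a tracker, a score like this would be maximized over candidate tool poses; a higher score indicates a better match between the template and the current image patch.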