Laparoscopic Tool Tracking Method for Augmented Reality Surgical Applications

  • Authors:
  • Alicia M. Cano; Francisco Gayá; Pablo Lamata; Patricia Sánchez-González; Enrique J. Gómez

  • Affiliations:
  • Grupo de Bioingeniería y Telemedicina (GBT), ETSIT, Universidad Politécnica de Madrid, Madrid, Spain 28040 (all authors)

  • Venue:
  • ISBMS '08: Proceedings of the 4th International Symposium on Biomedical Simulation
  • Year:
  • 2008

Abstract

Vision-based tracking of laparoscopic tools offers new possibilities for improving surgical training and for developing new augmented reality surgical applications. We present an original method to determine not only the tip position, but also the orientation of a laparoscopic tool with respect to the camera coordinate frame. A simple mathematical formulation shows how the segmented tool edges and the camera field of view define the tool's 3D orientation. The 3D position of the tool tip is then determined from the 2D image coordinates of any known point on the tool and from the tool's diameter. Accuracy is evaluated on real image sequences with known ground truth. Results show a positioning error of 9.28 mm RMS, which is explained by inaccuracies in the estimation of the tool edges. The main advantage of the proposed method is its robustness to occlusions of the tool tip.
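
The sketch below is not the authors' implementation, only a minimal illustration of the geometry the abstract describes: it assumes a calibrated pinhole camera with intrinsic matrix K, models the tool as a cylinder of known radius, and takes the two segmented tool edges as homogeneous image lines. Each edge back-projects to a plane through the camera centre tangent to the cylinder, so the tool axis is parallel to the cross product of the two plane normals; the depth of a known point assumed to lie on the tool axis then follows from the requirement that the axis sits at one radius from each tangent plane. The function name and arguments are illustrative, not from the paper.

```python
import numpy as np

def tool_pose_from_edges(K, l1, l2, marker_uv, radius):
    """Estimate tool axis direction and the 3D position of a known tool point.

    K         : 3x3 camera intrinsic matrix (from calibration).
    l1, l2    : homogeneous coefficients (a, b, c) of the two segmented
                tool edges, i.e. a*u + b*v + c = 0 in pixel coordinates.
    marker_uv : pixel coordinates (u, v) of a known point assumed to lie
                on the tool axis.
    radius    : physical tool radius (half the instrument diameter).
    """
    # Each image edge back-projects to a plane through the camera centre;
    # its normal in camera coordinates is K^T * l.
    n1 = K.T @ np.asarray(l1, dtype=float)
    n2 = K.T @ np.asarray(l2, dtype=float)
    n1 /= np.linalg.norm(n1)
    n2 /= np.linalg.norm(n2)

    # The cylinder axis is parallel to both tangent planes, hence parallel
    # to the cross product of their normals: this is the 3D tool orientation.
    axis = np.cross(n1, n2)
    axis /= np.linalg.norm(axis)

    # Back-project the known point into a viewing ray X(s) = s * K^{-1} [u v 1].
    ray = np.linalg.inv(K) @ np.array([marker_uv[0], marker_uv[1], 1.0])

    # A point on the cylinder axis lies at distance `radius` from each tangent
    # plane (both pass through the camera centre), so |n_i . X(s)| = radius.
    # Solve for the depth scale s with each edge and average for robustness.
    s1 = radius / abs(n1 @ ray)
    s2 = radius / abs(n2 @ ray)
    point_3d = 0.5 * (s1 + s2) * ray

    return axis, point_3d
```

Because the pose is recovered from the tool edges and a single visible reference point rather than from the tip itself, the estimate remains available when the tip is occluded, which is the robustness property highlighted in the abstract.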