Using a smart phone for information rendering in computer-aided surgery

  • Authors:
  • Gaël Le Bellego; Marek Bucki; Ivan Bricault

  • Affiliations:
  • UJF-Grenoble 1/CNRS/TIMC-IMAG UMR 5525, Grenoble, France; UJF-Grenoble 1/CNRS/TIMC-IMAG UMR 5525, Grenoble, France, and CHU Grenoble, Radiology and Imaging Department, Grenoble, France; UJF-Grenoble 1/CNRS/TIMC-IMAG UMR 5525, Grenoble, France

  • Venue:
  • HCII'11: Proceedings of the 14th International Conference on Human-Computer Interaction: Users and Applications - Volume Part IV
  • Year:
  • 2011

Abstract

Computer-aided surgery makes intensive use of the concept of navigation: after CT data have been collected from a patient and transferred to the operating room coordinate system, the surgical instrument (a puncture needle, for instance) is localized and its position is visualized with respect to the patient's organs, which are not directly visible. This approach is very similar to the GPS paradigm. Traditionally, three orthogonal slices of the patient data are presented on a distant screen; sometimes a 3D representation is also added. In this study we evaluated the potential of adding a smart phone as a man-machine interaction device. Different experiments involving operators puncturing a phantom are reported in this paper.
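
The navigation concept summarized above amounts to applying a tracker-to-image registration transform to the tracked instrument and selecting which slices of the CT volume to display. As a minimal illustrative sketch (not taken from the paper: the transform matrix, voxel spacing, and function name below are assumptions), the following Python snippet maps a tracked needle-tip position into CT voxel indices and returns the sagittal, coronal, and axial slice indices that a navigation display would show:

    import numpy as np

    # Stand-in for the result of registering the tracker (operating-room)
    # coordinate system to the CT volume; values are illustrative only.
    T_tracker_to_ct = np.array([
        [0.0, -1.0, 0.0, 120.0],
        [1.0,  0.0, 0.0,  80.0],
        [0.0,  0.0, 1.0, -45.0],
        [0.0,  0.0, 0.0,   1.0],
    ])

    VOXEL_SPACING_MM = np.array([0.8, 0.8, 1.5])  # assumed CT spacing along x, y, z

    def needle_tip_to_slices(tip_tracker_mm):
        """Map a tracked needle tip (mm, tracker frame) to CT voxel indices
        and return the sagittal/coronal/axial slice indices to display."""
        tip_homogeneous = np.append(tip_tracker_mm, 1.0)
        tip_ct_mm = (T_tracker_to_ct @ tip_homogeneous)[:3]
        i, j, k = np.round(tip_ct_mm / VOXEL_SPACING_MM).astype(int)
        return {"sagittal": i, "coronal": j, "axial": k}

    # Example: one arbitrarily chosen tracked tip position.
    print(needle_tip_to_slices(np.array([10.0, 25.0, 300.0])))

Whether these three slices are rendered on a distant screen or on a hand-held smart phone is precisely the interaction question the study evaluates.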