Computer-assisted surgery makes intensive use of the concept of navigation: after CT data have been acquired from a patient and registered to the operating-room coordinate system, the surgical instrument (a puncture needle, for instance) is localized and its position is visualized with respect to the patient's organs, which are not directly visible. This approach is very similar to the GPS paradigm. Traditionally, three orthogonal slices through the patient data are displayed on a distant screen; sometimes a 3D representation is added as well. In this study we evaluated the potential of adding a smartphone as a human-machine interaction device. Several experiments in which operators punctured a phantom are reported in this paper.
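The navigation loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the rigid transform `T_tracker_to_ct` (assumed to come from a prior patient-to-image registration step), the voxel spacing, and the function name are all hypothetical placeholders.

```python
import numpy as np

# Hypothetical rigid registration (rotation + translation) mapping
# tracker coordinates (mm) to CT image coordinates (mm); in practice
# this would be estimated, e.g., by paired-point registration.
T_tracker_to_ct = np.array([
    [1.0, 0.0, 0.0, 12.0],
    [0.0, 1.0, 0.0, -4.0],
    [0.0, 0.0, 1.0, 30.0],
    [0.0, 0.0, 0.0,  1.0],
])

# Assumed CT voxel spacing in mm (x, y, z).
voxel_size_mm = np.array([0.5, 0.5, 1.0])

def needle_tip_to_slices(tip_tracker_mm):
    """Map a tracked needle-tip position (mm, tracker frame) to the
    sagittal/coronal/axial slice indices of the CT volume, i.e. the
    three orthogonal slices a navigation display would show."""
    tip_h = np.append(tip_tracker_mm, 1.0)        # homogeneous coordinates
    tip_ct_mm = (T_tracker_to_ct @ tip_h)[:3]     # into the CT frame
    ijk = np.round(tip_ct_mm / voxel_size_mm).astype(int)
    return {"sagittal": ijk[0], "coronal": ijk[1], "axial": ijk[2]}

print(needle_tip_to_slices(np.array([10.0, 10.0, 10.0])))
```

At each tracking update the localized tip is pushed through this transform and the three slice indices select the images to render, whether on a distant screen or on a handheld display.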