The increasing use of robotic-assisted minimally invasive surgery (MIS) provides an ideal environment for image-guided surgery with Augmented Reality (AR). Seamless synthesis of AR depends on a number of factors relating to how virtual objects appear and visually interact with the real environment. Traditional overlaid AR approaches generally suffer from a loss of depth perception. This paper presents a new AR method for robotic-assisted MIS, which uses a novel pq-space based non-photorealistic rendering technique to provide see-through vision of the embedded virtual object whilst maintaining salient anatomical details of the exposed anatomical surface. Experimental results with both phantom and in vivo lung lobectomy data demonstrate the visual realism achieved by the proposed method and its accuracy in providing high-fidelity AR depth perception.
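The pq-space idea referred to above can be sketched in code, where p = ∂z/∂x and q = ∂z/∂y are the components of the surface gradient of the exposed anatomy. The function name, the Lambertian point-light shading, and the rule that modulates the virtual object's opacity by gradient magnitude (so salient surface detail stays visible) are illustrative assumptions for this sketch, not the paper's actual formulation:

```python
import numpy as np

def pq_space_overlay(depth, virtual_rgb, real_rgb,
                     light=(0.0, 0.0, 1.0), alpha=0.6):
    """Blend a virtual object into the real view while preserving
    surface detail via pq-space (surface-gradient) shading.

    depth       : (H, W) depth map of the exposed anatomical surface
    virtual_rgb : (H, W, 3) rendering of the embedded virtual object
    real_rgb    : (H, W, 3) endoscopic image
    light       : assumed light direction (illustrative, not the paper's model)
    alpha       : base opacity of the virtual layer
    """
    # pq-space: p = dz/dx, q = dz/dy.  np.gradient returns the
    # row-axis (y) gradient first, then the column-axis (x) gradient.
    q, p = np.gradient(depth.astype(float))

    # Surface normal n = (-p, -q, 1) / |(-p, -q, 1)|, Lambertian shade n.l
    n = np.dstack((-p, -q, np.ones_like(depth)))
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    l = np.asarray(light, float)
    l /= np.linalg.norm(l)
    shade = np.clip(n @ l, 0.0, 1.0)

    # Gradient magnitude as a simple saliency proxy: where the surface
    # has ridges/creases, keep the real shaded anatomy opaque; where it
    # is flat, let the virtual object show through.
    saliency = np.clip(np.hypot(p, q), 0.0, 1.0)
    a = (alpha * (1.0 - saliency))[..., None]

    # See-through composite of virtual object over shaded real surface
    return a * virtual_rgb + (1.0 - a) * (shade[..., None] * real_rgb)
```

In this sketch the virtual object never fully occludes the anatomy: its opacity drops wherever the surface gradient is strong, which is one plausible way to realise the "see-through vision whilst maintaining salient anatomical details" described in the abstract.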