The present study examined the impact of augmented-reality visualization, compared with conventional ultrasound (CUS), on learning ultrasound-guided needle insertion. Whereas CUS requires cognitive processes to localize targets, our augmented-reality device, called the "sonic flashlight" (SF), enables direct perceptual guidance. Participants guided a needle to an ultrasound-localized target within opaque fluid. In three experiments, the SF produced higher accuracy and lower variability in aiming and endpoint placement than did CUS. Learning with the SF, but not with CUS, transferred readily to new targets and new starting points for action. These effects were evident both in visually guided action (needle and target continuously visible) and in visually directed action (target alone visible). The results have application to learning to visualize surgical targets through ultrasound.