The objective of this research is to develop and evaluate a context-aware Augmented Reality system that filters content based on the local context of the surgical instrument. We optically track the positions of the patient and the instrument and interpret these data to recognize the current phase of the operation. Depending on the result, an appropriate visualization is generated and displayed. For the interpretation, we combine a rule-based, deductive approach with a case-based, inductive one; both rely on a description-logic-based ontology. In phantom experiments, the system was used to support implant positioning in models of the mandible. It recognized the phase correctly and provided an appropriate visualization about 85% of the time. The knowledge-based concept for intraoperative assistance proved capable of generating useful visualizations in a timely manner. However, further work is necessary to improve accuracy and to reduce the deviation between the actual and planned implant positions.
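The hybrid interpretation strategy described above can be illustrated with a minimal sketch: hand-written rules fire first, and a case-based nearest-neighbour lookup covers situations the rules leave open. All feature names, thresholds, and stored cases below are hypothetical placeholders, not the paper's actual ontology or data; the real system derives context from optical tracking and a description-logic ontology.

```python
import math
from dataclasses import dataclass

# Hypothetical feature vector for the local context of the tracked instrument.
@dataclass
class Context:
    dist_to_site_mm: float  # instrument tip to planned implant site (assumed feature)
    speed_mm_s: float       # instrument tip velocity (assumed feature)

# Deductive part: explicit rules return a phase, or None when no rule fires.
def rule_based_phase(ctx: Context):
    if ctx.dist_to_site_mm > 100:
        return "idle"
    if ctx.dist_to_site_mm < 5 and ctx.speed_mm_s < 1:
        return "drilling"
    return None  # rules are inconclusive

# Inductive part: 1-nearest-neighbour over previously labelled example cases.
CASES = [
    (Context(80, 40), "approach"),
    (Context(10, 5), "positioning"),
    (Context(3, 0.5), "drilling"),
]

def case_based_phase(ctx: Context):
    def dist(a: Context, b: Context) -> float:
        return math.hypot(a.dist_to_site_mm - b.dist_to_site_mm,
                          a.speed_mm_s - b.speed_mm_s)
    return min(CASES, key=lambda case: dist(ctx, case[0]))[1]

def recognize_phase(ctx: Context):
    # Rules take precedence; cases handle situations the rules miss.
    return rule_based_phase(ctx) or case_based_phase(ctx)

print(recognize_phase(Context(120, 30)))  # "idle": the distance rule fires
print(recognize_phase(Context(12, 6)))    # "positioning": nearest stored case
```

The recognized phase would then select which visualization the AR display shows, so a misclassification degrades assistance rather than blocking it.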