Pattern-based real-time feedback for a temporal bone simulator

  • Authors: Yun Zhou, James Bailey, Ioanna Ioannou, Sudanthi Wijewickrema, Stephen O'Leary, Gregor Kennedy

  • Affiliation: University of Melbourne (all authors)

  • Venue: Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology (VRST 2013)
  • Year: 2013

Abstract

Delivering automated real-time performance feedback in simulated surgical environments is an important and challenging task. We propose a framework based on patterns to evaluate surgical performance and provide feedback during simulated ear (temporal bone) surgery in a 3D virtual environment. Temporal bone surgery is composed of a number of stages with distinct aims and surgical techniques. To provide context-appropriate feedback we must be able to identify each stage, recognise when feedback is to be provided, and determine the nature of that feedback. To achieve these aims, we train pattern-based models using data recorded by a temporal bone simulator. We create one model to predict the current stage of the procedure and separate stage-specific models to provide human-friendly feedback within each stage. We use 27 temporal bone simulation runs conducted by 7 expert ear surgeons and 6 trainees to train and evaluate our models. The results of our evaluation show that the proposed system identifies the stage of the procedure correctly and provides constructive feedback to assist surgical trainees in improving their technique.
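
To make the two-level design described above concrete, the sketch below shows one way such a staged pipeline could be wired together: a stage classifier routes each simulator sample to a stage-specific model that decides whether feedback is warranted. The feature layout, synthetic labels, and use of scikit-learn's DecisionTreeClassifier are illustrative assumptions only; they are not the authors' pattern-based models or simulator data.

```python
# Minimal sketch of a staged feedback pipeline, assuming tree classifiers
# as stand-ins for the paper's pattern-based models.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Hypothetical per-sample features recorded by the simulator
# (e.g. drill force, burr size, distance to a critical structure).
X = rng.random((300, 3))
stage_labels = rng.integers(0, 3, size=300)      # which of 3 surgical stages
feedback_labels = rng.integers(0, 2, size=300)   # should feedback be given?

# Stage model: predicts which stage of the procedure the trainee is in.
stage_model = DecisionTreeClassifier(max_depth=4).fit(X, stage_labels)

# Stage-specific feedback models: one per stage, trained only on that
# stage's samples so advice matches the stage's aims and techniques.
feedback_models = {}
for stage in np.unique(stage_labels):
    mask = stage_labels == stage
    feedback_models[stage] = DecisionTreeClassifier(max_depth=3).fit(
        X[mask], feedback_labels[mask]
    )

def realtime_feedback(sample):
    """Route one new simulator sample through the staged pipeline."""
    stage = int(stage_model.predict(sample.reshape(1, -1))[0])
    give_feedback = bool(feedback_models[stage].predict(sample.reshape(1, -1))[0])
    return stage, give_feedback

# Example: classify a fresh sample and decide whether to intervene.
print(realtime_feedback(rng.random(3)))
```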