Content-based surgical workflow representation using probabilistic motion modeling

  • Authors:
  • Stamatia Giannarou; Guang-Zhong Yang

  • Affiliations:
  • Institute of Biomedical Engineering, Imperial College London, UK; Department of Computing, Imperial College London, UK

  • Venue:
  • MIAR'10: Proceedings of the 5th International Conference on Medical Imaging and Augmented Reality
  • Year:
  • 2010


Abstract

Succinct content-based representation of minimally invasive surgery (MIS) video is important for efficient surgical workflow analysis and for modeling instrument-tissue interaction. Current approaches to video representation are not well suited to MIS, as they neither fully capture the underlying tissue deformation nor provide reliable feature tracking. The aim of this paper is to propose a novel framework for content-based surgical scene representation that simultaneously identifies key surgical episodes and encodes the motion of tracked salient features. The proposed method does not require pre-segmentation of the scene and can be easily combined with 3D scene reconstruction techniques to provide further scene representation without the need to return to the raw data.
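
The abstract does not specify the form of the probabilistic motion model, so the following is only a minimal illustrative sketch of the general idea: summarizing tracked salient features into per-frame motion descriptors and clustering them with a Gaussian mixture to propose candidate surgical episodes. The descriptors are synthetic stand-ins, and the use of `sklearn.mixture.GaussianMixture` is an assumption for illustration, not the authors' method.

```python
# Illustrative sketch (not the authors' method): cluster per-frame motion
# descriptors of tracked salient features into candidate "episodes" with a
# Gaussian mixture model. The descriptors below are synthetic stand-ins for
# statistics that would be computed from real MIS feature tracks.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical per-frame descriptors: mean displacement magnitude and mean
# direction of tracked features, emulating three distinct motion regimes.
frames = np.vstack([
    rng.normal([0.2, 0.0], 0.05, size=(100, 2)),   # near-static tissue
    rng.normal([2.0, 1.5], 0.30, size=(80, 2)),    # sweeping instrument motion
    rng.normal([1.0, -1.0], 0.20, size=(60, 2)),   # localized interaction
])

# Fit a 3-component GMM and assign each frame to its most likely motion mode.
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
labels = gmm.fit_predict(frames)

# Transitions between consecutive labels give candidate episode boundaries.
boundaries = np.flatnonzero(np.diff(labels)) + 1
print("candidate episode boundaries at frames:", boundaries.tolist())
```

In practice the descriptors would be derived from the tracked salient features themselves (e.g., displacement statistics per frame), and the number of mixture components or the segmentation criterion would depend on the workflow model adopted.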