Sparse hidden markov models for surgical gesture classification and skill evaluation

  • Authors:
  • Lingling Tao; Ehsan Elhamifar; Sanjeev Khudanpur; Gregory D. Hager; René Vidal

  • Affiliations:
  • CS, Johns Hopkins University, Baltimore, MD; CS, Johns Hopkins University, Baltimore, MD; CS, Johns Hopkins University, Baltimore, MD; ECE Dept., Johns Hopkins University, Baltimore, MD; BME, Johns Hopkins University, Baltimore, MD

  • Venue:
  • IPCAI'12: Proceedings of the Third International Conference on Information Processing in Computer-Assisted Interventions
  • Year:
  • 2012


Abstract

We consider the problem of classifying surgical gestures and skill level in robotic surgical tasks. Prior work in this area models gestures as the states of a hidden Markov model (HMM) whose observations are discrete, Gaussian, or factor-analyzed. While successful, these approaches are limited in expressive power by their restrictive observation models. In this paper, we propose a new model, the sparse HMM, whose observations are sparse linear combinations of elements from a dictionary of basic surgical motions. Given motion data from many surgeons with different skill levels, we propose an algorithm for learning a dictionary for each gesture together with an HMM grammar describing the transitions among gestures. We then use these dictionaries and the grammar to represent and classify new motion data. Experiments on a database of surgical motions acquired with the da Vinci system show that our method performs on par with or better than state-of-the-art methods. This suggests that learning a grammar based on sparse motion dictionaries is important for gesture and skill classification.
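As a rough illustration of the observation model the abstract describes — representing a motion-feature vector as a sparse linear combination of dictionary atoms — the sketch below sparse-codes a synthetic observation with greedy orthogonal matching pursuit. This is not the paper's algorithm; the dictionary size, feature dimension, sparsity level, and data are all hypothetical.

```python
import numpy as np

def omp(D, x, k):
    """Greedy orthogonal matching pursuit: approximate x as a linear
    combination of at most k columns (atoms) of dictionary D."""
    residual = x.copy()
    support = []
    coeffs = np.zeros(D.shape[1])
    for _ in range(k):
        # Select the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit coefficients on the selected support by least squares.
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        coeffs[:] = 0.0
        coeffs[support] = sol
        residual = x - D @ coeffs
    return coeffs

rng = np.random.default_rng(0)
# Hypothetical dictionary of 20 "basic motion" atoms in a 10-D feature space,
# with unit-norm columns.
D = rng.standard_normal((10, 20))
D /= np.linalg.norm(D, axis=0)
# Synthesize an observation from 3 atoms, then recover a sparse code for it.
true_code = np.zeros(20)
true_code[[2, 7, 11]] = [1.0, -0.5, 2.0]
x = D @ true_code
code = omp(D, x, k=3)
print(np.count_nonzero(code))  # at most 3 nonzero coefficients
```

In the paper's setting, one dictionary of this kind would be learned per gesture, and the sparse codes would serve as the HMM's emissions rather than discrete or Gaussian observations.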