An eye-hand data fusion framework for pervasive sensing of surgical activities

  • Authors:
  • S. Thiemjarus; A. James; G.-Z. Yang

  • Affiliations:
  • School of Information and Computer Technology, Sirindhorn International Institute of Technology, Thammasat University, Pathum Thani, Thailand; Department of Biosurgery and Surgical Technology, Imperial College London, UK; The Hamlyn Centre for Robotic Surgery, Imperial College London, UK

  • Venue:
  • Pattern Recognition
  • Year:
  • 2012

Abstract

This paper describes a generic framework for activity recognition based on temporal signals acquired from multiple input modalities and demonstrates its use for eye-hand data fusion. As part of the framework, we present a multi-objective Bayesian Framework for Feature Selection (BFFS) with a pruned-tree search algorithm for finding the optimal feature set(s) in a computationally efficient manner. Experiments on endoscopic surgical episode recognition are used to investigate the potential of eye tracking for pervasive monitoring of surgical operations and to demonstrate how additional information derived from hand motion can further enhance recognition accuracy. With the proposed multi-objective BFFS algorithm, feature sets that are suitable in terms of both feature relevance and redundancy can be identified with a minimal number of instruments being tracked.
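
The paper's BFFS formulation is not reproduced here, but the following minimal Python sketch illustrates the general shape of a pruned-tree feature-subset search: subsets are grown one feature at a time, scored by a hypothetical relevance-minus-redundancy objective (an mRMR-style stand-in for the paper's Bayesian multi-objective criteria), and all but the top-scoring partial subsets are pruned at each depth. The function names, the `beam_width` parameter, and the correlation-based score are illustrative assumptions, not the authors' implementation.

```python
import heapq

import numpy as np


def _abs_corr(a, b):
    """Absolute Pearson correlation, guarding against zero-variance signals."""
    if a.std() == 0 or b.std() == 0:
        return 0.0
    return float(abs(np.corrcoef(a, b)[0, 1]))


def subset_score(X, y, subset):
    """Hypothetical objective: mean relevance to the label minus mean
    pairwise redundancy (a stand-in for the paper's Bayesian criteria)."""
    relevance = np.mean([_abs_corr(X[:, j], y) for j in subset])
    if len(subset) < 2:
        return relevance
    redundancy = np.mean([_abs_corr(X[:, i], X[:, j])
                          for i in subset for j in subset if i < j])
    return relevance - redundancy


def pruned_tree_search(X, y, max_features=5, beam_width=10):
    """Grow feature subsets one feature per depth level, keeping only the
    top `beam_width` partial subsets at each depth (beam-style pruning of
    the subset tree, so the search stays computationally tractable)."""
    n_features = X.shape[1]
    frontier = [(tuple(), -np.inf)]
    best_subset, best_score = tuple(), -np.inf
    for _ in range(max_features):
        candidates = {}
        for subset, _score in frontier:
            # Extend only with higher-indexed features to avoid duplicates.
            start = subset[-1] + 1 if subset else 0
            for j in range(start, n_features):
                extended = subset + (j,)
                candidates[extended] = subset_score(X, y, extended)
        if not candidates:
            break
        frontier = heapq.nlargest(beam_width, candidates.items(),
                                  key=lambda kv: kv[1])
        if frontier[0][1] > best_score:
            best_subset, best_score = frontier[0]
    return list(best_subset), best_score


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 12))          # synthetic multi-channel features
    y = X[:, 2] + 0.5 * X[:, 7] + 0.1 * rng.normal(size=200)
    subset, score = pruned_tree_search(X, y, max_features=4)
    print("selected features:", subset, "score: %.3f" % score)
```

In the paper itself, the search is driven by Bayesian, multi-objective criteria rather than the correlation heuristic above; the sketch only conveys the grow-and-prune tree structure that the abstract credits with keeping the feature-set search efficient.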