An adaptive solution for intra-operative gesture-based human-machine interaction

  • Authors:
  • Ali Bigdelou; Loren Schwarz; Nassir Navab

  • Affiliations:
  • Technische Universität München, München, Germany (all authors)

  • Venue:
  • IUI '12: Proceedings of the 2012 ACM international conference on Intelligent User Interfaces
  • Year:
  • 2012

Abstract

Computerized medical systems play a vital role in the operating room; however, sterility requirements and the interventional workflow often make interaction with these devices challenging for surgeons. Typical solutions, such as delegating physical control of the keyboard and mouse to assistants, add an undesirable level of indirection. We present a touchless, gesture-based interaction framework for the operating room that lets surgeons define a personalized set of gestures for controlling arbitrary computerized medical systems. Instead of using cameras to capture gestures, we rely on a few wireless inertial sensors placed on the surgeon's arms, eliminating the dependence on illumination and line of sight. A discriminative gesture recognition approach based on kernel regression allows us to simultaneously classify performed gestures and track the relative spatial pose within each gesture, giving surgeons fine-grained control of continuous parameters. An extensible software architecture enables a dynamic association of learned gestures with arbitrary intra-operative computerized systems. Our experiments illustrate the performance of our approach and support its practical applicability.
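
The key technical idea, using one kernel-regression estimator to both classify a gesture and recover the continuous phase within it, can be illustrated with a short sketch. The paper does not publish an implementation, so the Python/NumPy code below is a minimal sketch under stated assumptions: a Gaussian kernel, a Nadaraya-Watson estimator, and a flattened feature vector standing in for the inertial-sensor readings. The class name, bandwidth parameter, and phase encoding are hypothetical, not taken from the paper.

    # Illustrative sketch only; the kernel choice (Gaussian) and the
    # Nadaraya-Watson form are assumptions, not the authors' exact method.
    import numpy as np

    class KernelGestureRegressor:
        """Maps an inertial-sensor feature vector to (gesture id, phase).

        X[i]      : feature vector, e.g. stacked orientation readings
                    from the arm-worn wireless inertial sensors
        labels[i] : id of the gesture the sample belongs to
        phases[i] : relative position within that gesture, in [0, 1]
        """

        def __init__(self, X, labels, phases, bandwidth=1.0):
            self.X = np.asarray(X, dtype=float)
            self.labels = np.asarray(labels)
            self.phases = np.asarray(phases, dtype=float)
            self.bandwidth = bandwidth

        def _weights(self, x):
            # Gaussian kernel weight of every training sample w.r.t. x.
            d2 = np.sum((self.X - x) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * self.bandwidth ** 2))
            return w / (w.sum() + 1e-12)

        def predict(self, x):
            w = self._weights(np.asarray(x, dtype=float))
            # Classification: the gesture whose samples carry most weight.
            classes = np.unique(self.labels)
            scores = np.array([w[self.labels == c].sum() for c in classes])
            gesture = classes[np.argmax(scores)]
            # Regression: Nadaraya-Watson phase estimate, restricted to the
            # winning gesture's samples. This phase is the continuous
            # parameter a surgeon could steer (e.g. a slice index).
            m = self.labels == gesture
            phase = float(w[m] @ self.phases[m] / (w[m].sum() + 1e-12))
            return gesture, phase

    # Toy usage with 1-D features: gesture 0 lives near 0, gesture 1 near 5.
    X = [[0.0], [0.5], [1.0], [5.0], [5.5], [6.0]]
    model = KernelGestureRegressor(X, labels=[0, 0, 0, 1, 1, 1],
                                   phases=[0.0, 0.5, 1.0, 0.0, 0.5, 1.0],
                                   bandwidth=0.5)
    print(model.predict([0.6]))  # -> (0, ~0.55): gesture 0, mid-gesture

Because a single estimator produces both outputs, one pass over the training samples suffices per sensor update, which is consistent with the abstract's claim of classifying a gesture and tracking the pose within it simultaneously.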