Touch-less interaction with medical images using hand & foot gestures

  • Authors:
  • Shahram Jalaliniya; Jeremiah Smith; Miguel Sousa; Lars Büthe; Thomas Pederson

  • Affiliations:
  • IT University of Copenhagen, Copenhagen, Denmark; Imperial College London, London, United Kingdom; Future-Shape GmbH, Höhenkirchen-Siegertsbrunn, Germany; ETH Zürich, Zürich, Switzerland; IT University of Copenhagen, Copenhagen, Denmark

  • Venue:
  • Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication (UbiComp '13 Adjunct)
  • Year:
  • 2013


Abstract

Sterility restrictions in surgical settings make touch-less interaction an attractive way for surgeons to interact directly with digital images. The HCI community has already explored several touch-less interaction methods, including camera-based gesture tracking and voice control. In this paper, we present a system for gesture-based interaction with medical images that relies on a single wristband sensor and capacitive floor sensors, allowing for combined hand and foot gesture input. A first, limited evaluation of the system showed an acceptable level of accuracy for 12 different hand and foot gestures, and users found the combined hand and foot gestures intuitive for providing input.
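The abstract describes fusing hand gestures from a wristband sensor with foot gestures from capacitive floor sensors. As a purely illustrative sketch (the gesture names, command mappings, and fusion rule below are assumptions, not the paper's actual design), such input fusion could look like:

```python
# Hypothetical sketch: combining one foot gesture (from a floor sensor)
# with one hand gesture (from a wristband) into an image-viewer command.
# All gesture names and mappings are illustrative assumptions.

FOOT_MODES = {"left_tap": "zoom", "right_tap": "scroll"}   # foot selects a mode
HAND_ACTIONS = {"swipe_left": -1, "swipe_right": +1}       # hand gives a direction

def fuse(foot_gesture: str, hand_gesture: str):
    """Return a (mode, step) command, or None if either gesture is unknown."""
    mode = FOOT_MODES.get(foot_gesture)
    step = HAND_ACTIONS.get(hand_gesture)
    if mode is None or step is None:
        return None  # unrecognized input: ignore rather than misfire
    return (mode, step)

print(fuse("left_tap", "swipe_right"))  # ('zoom', 1)
print(fuse("stomp", "swipe_left"))      # None
```

Keeping the foot as a mode selector and the hand as the fine-grained control is one plausible division of labor for sterile settings, since it leaves the surgeon's hands free between gestures.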