Phase recognition during surgical procedures using embedded and body-worn sensors

  • Authors:
  • Jakob E. Bardram; Afsaneh Doryab; Rune M. Jensen; Poul M. Lange; Kristian L. G. Nielsen; Søren T. Petersen

  • Affiliations:
  • IT University of Copenhagen, Rued Langgaards Vej 7, DK-2300, Denmark (all authors)

  • Venue:
  • PERCOM '11 Proceedings of the 2011 IEEE International Conference on Pervasive Computing and Communications
  • Year:
  • 2011

Abstract

In Ubiquitous Computing (Ubicomp) research, substantial work has been directed towards sensor-based detection and recognition of human activity. This research has, however, mainly focused on the activities of daily living of a single person. This paper presents a sensor platform and a machine learning approach for sensing and detecting the phases of a surgical operation. Automatic detection of the progress of work inside an operating room has several important applications, including coordination, patient safety, and context-aware information retrieval. We verify the platform during a surgical simulation; the main phases of an operation were recognized with a high degree of accuracy. Through further analysis, we were able to reveal which sensors provide the most significant input, an insight that can inform the subsequent design of systems for use during real surgeries.
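The phase-recognition idea in the abstract can be illustrated with a minimal sketch: feature vectors derived from embedded and body-worn sensors are mapped to operation phases by a classifier. The sketch below uses a simple nearest-centroid classifier over hypothetical feature vectors; the feature names, phase labels, and classifier choice are assumptions for illustration, not the paper's actual sensor set or learning method.

```python
# Illustrative sketch only: nearest-centroid classification of operating-room
# phases from hypothetical sensor features (all names/values are assumed,
# not taken from the paper).
import math

# Training examples: (feature vector, phase label). Features might be, e.g.,
# [instrument-tray weight (kg), anesthesia-machine activity, staff count].
TRAIN = [
    ([5.0, 0.0, 2.0], "preparation"),
    ([5.2, 0.1, 3.0], "preparation"),
    ([3.0, 1.0, 5.0], "surgery"),
    ([2.8, 1.0, 6.0], "surgery"),
    ([4.5, 0.2, 3.0], "closing"),
    ([4.7, 0.1, 2.0], "closing"),
]

def centroids(samples):
    """Compute the mean feature vector for each phase label."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(vec, cents):
    """Return the phase whose centroid is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(cents, key=lambda lab: dist(vec, cents[lab]))

cents = centroids(TRAIN)
print(classify([2.9, 1.0, 5.5], cents))  # → surgery
```

In a real deployment the feature vectors would arrive as a time series, so a sequence model (e.g., an HMM) that exploits the known ordering of surgical phases would typically outperform per-sample classification; the sketch above only shows the per-sample mapping.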