Episode detection in videos captured using a head-mounted camera

  • Authors:
  • Aneesh Chauhan; Sameer Singh; Dave Grosvenor

  • Affiliations:
  • University of Exeter, Autonomous Technologies Research, Department of Computer Science, EX4 4QF, Exeter, UK; University of Exeter, Autonomous Technologies Research, Department of Computer Science, EX4 4QF, Exeter, UK; Hewlett Packard Research Labs, Digital Media Department, Frenchay, Bristol, UK

  • Venue:
  • Pattern Analysis & Applications
  • Year:
  • 2004

Abstract

With the advent of wearable computing, personal imaging, photojournalism and personal video diaries, the need for automated archiving of the videos these applications produce has become pressing. The principal device used to capture the wearer's interaction with the environment is a wearable camera, usually head-mounted. The videos obtained from such a camera are raw, unedited records of the wearer's visual interaction with the surroundings. The focus of our research is to develop post-processing techniques that automatically abstract such videos based on episode detection. An episode is defined as a part of the video captured while the user was interested in an external event and paid attention to recording it. Our research is based on the assumption that head movements exhibit distinguishable patterns during an episode, and that these patterns can be exploited to differentiate episodes from non-episodes. We present a novel algorithm that exploits head and body behaviour to detect episodes. The algorithm's performance is measured by comparing the ground truth (user-declared episodes) with the detected episodes. Experiments on several hours of head-mounted video captured in varied locations show that the proposed method achieves a high degree of success.
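
The abstract does not give the algorithm itself, so the following is only a loose, hypothetical sketch of the underlying idea: treat sustained intervals of low head motion as candidate episodes and score them against user-declared ground truth. All function names, thresholds and the synthetic data are assumptions for illustration, not the authors' method.

    # Illustrative sketch only: a minimal "stable head" episode detector.
    # Names and thresholds are hypothetical; the paper's actual algorithm,
    # which models head and body behaviour, is not reproduced here.
    import numpy as np

    def detect_episodes(head_motion, fps=25, smooth_s=1.0,
                        motion_thresh=0.5, min_len_s=2.0):
        """Return (start, end) frame intervals where smoothed head motion
        stays below a threshold for at least min_len_s seconds, taken as
        candidate 'episodes' of sustained attention."""
        win = max(1, int(smooth_s * fps))
        smoothed = np.convolve(head_motion, np.ones(win) / win, mode="same")
        stable = smoothed < motion_thresh

        episodes, start = [], None
        for i, s in enumerate(stable):
            if s and start is None:
                start = i
            elif not s and start is not None:
                if i - start >= min_len_s * fps:
                    episodes.append((start, i))
                start = None
        if start is not None and len(stable) - start >= min_len_s * fps:
            episodes.append((start, len(stable)))
        return episodes

    def ground_truth_coverage(detected, ground_truth):
        """Fraction of ground-truth frames covered by detected episodes
        (a simple recall-style score against user-declared episodes)."""
        gt = set()
        for a, b in ground_truth:
            gt.update(range(a, b))
        det = set()
        for a, b in detected:
            det.update(range(a, b))
        return len(gt & det) / max(1, len(gt))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Synthetic per-frame head-motion magnitude: noisy movement with
        # two calmer stretches standing in for episodes.
        motion = rng.uniform(0.8, 2.0, 3000)
        motion[500:900] = rng.uniform(0.0, 0.3, 400)
        motion[1800:2400] = rng.uniform(0.0, 0.3, 600)

        detected = detect_episodes(motion)
        print("Detected episodes (frames):", detected)
        print("Coverage of ground truth:",
              ground_truth_coverage(detected, [(500, 900), (1800, 2400)]))

In this toy setup the coverage score plays the role of comparing detected episodes against the user-declared ground truth; the paper evaluates its method in the same spirit on several hours of real head-mounted video.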