Body-tracking camera control for demonstration videos

  • Authors:
  • Derrick Cheng, Pei-Yu Chi, Taeil Kwak, Björn Hartmann, Paul Wright

  • Affiliations:
  • University of California, Berkeley, Berkeley, California, USA (all authors)

  • Venue:
  • CHI '13 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2013

Abstract

A large community of users creates and shares how-to videos online. Many of these videos show demonstrations of physical tasks, such as fixing a machine, assembling furniture, or demonstrating dance steps. It is often difficult for the authors of these videos to control camera focus, view, and position while performing their tasks. To help authors produce videos, we introduce Kinectograph, a recording device that automatically pans and tilts to follow specific body parts, e.g., hands, of a user in a video. It utilizes a Kinect depth sensor to track skeletal data and adjusts the camera angle via a 2D pan-tilt gimbal mount. Users control and configure Kinectograph through a tablet application with real-time video preview. An informal user study suggests that users prefer to record and share videos with Kinectograph, as it enables authors to focus on performing their demonstration tasks.
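The abstract outlines the core control loop: track a chosen joint in the Kinect skeleton stream, then pan and tilt the gimbal so that joint stays centered in the frame. The sketch below is an illustrative approximation of that idea, not the authors' implementation; the proportional gain, dead-band threshold, coordinate conventions, and function names are all assumptions introduced here for clarity.

```python
import math

# Minimal sketch (assumed, not Kinectograph's actual code): given the 3D
# position of a tracked joint (e.g., the right hand) in the Kinect camera
# frame, compute the angular offset from the optical axis and nudge the
# pan/tilt angles toward re-centering it. Joint coordinates are in meters,
# with x to the right, y up, and z forward (depth).

def centering_errors(joint_x, joint_y, joint_z):
    """Angular offsets (radians) of the joint from the camera's optical axis."""
    pan_error = math.atan2(joint_x, joint_z)   # horizontal offset
    tilt_error = math.atan2(joint_y, joint_z)  # vertical offset
    return pan_error, tilt_error

def step_gimbal(pan_deg, tilt_deg, joint, gain=0.5, deadband_deg=2.0):
    """Return updated pan/tilt angles (degrees) that move the joint toward center."""
    pan_err, tilt_err = centering_errors(*joint)
    pan_err_deg = math.degrees(pan_err)
    tilt_err_deg = math.degrees(tilt_err)
    # Ignore small offsets so the camera does not jitter around the target.
    if abs(pan_err_deg) > deadband_deg:
        pan_deg += gain * pan_err_deg
    if abs(tilt_err_deg) > deadband_deg:
        tilt_deg += gain * tilt_err_deg
    return pan_deg, tilt_deg

# Example: hand tracked 0.4 m right of and 0.2 m below center at 2 m depth.
pan, tilt = step_gimbal(0.0, 0.0, (0.4, -0.2, 2.0))
print(round(pan, 1), round(tilt, 1))  # small corrective pan right and tilt down
```

The dead band and proportional gain stand in for whatever smoothing the real system uses to keep the recorded footage steady; the resulting angles would be sent to the 2D pan-tilt mount's servo controller.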