Modeling human behavior from simple sensors in the home

  • Authors:
  • Ryan Aipperspach, Elliot Cohen, John Canny

  • Affiliations:
  • Berkeley Institute of Design, University of California, Berkeley, CA (all authors)

  • Venue:
  • PERVASIVE'06 Proceedings of the 4th international conference on Pervasive Computing
  • Year:
  • 2006

Abstract

Pervasive sensors in the home have a variety of applications, including energy minimization, activity monitoring for elders, and tutors for household tasks such as cooking. Many of the most common sensors today are binary, e.g., IR motion sensors, door-close sensors, and floor pressure pads. Predicting user behavior is one of the key enablers for these applications. While we consider smart home data here, the general problem is one of predicting discrete human actions. Drawing on Activity Theory, the language-as-action principle, and speech understanding research, we argue that smoothed n-grams are well suited to this task. We built such a model and applied it to data gathered from three smart home installations. The data showed a classic Zipf (power-law) distribution, similar to speech and language. We found that the predictive accuracy of the n-gram model ranges from 39% to 51%, which is significantly above the baseline for deployments of 16, 76, and 70 sensors. While we cannot directly compare this result with other work because of a lack of shared data, an examination of high-entropy zones in the datasets (e.g., the kitchen triangle) suggests that accuracies around 50% are the best possible for this task.
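
The paper itself does not include code; the following is a minimal sketch of the kind of smoothed n-gram predictor the abstract describes, assuming each binary sensor firing is reduced to a sensor-ID string and using simple add-one (Laplace) smoothing rather than whatever smoothing scheme the authors actually used. The sensor names and the `NGramPredictor` class are illustrative, not from the paper.

```python
from collections import Counter, defaultdict


class NGramPredictor:
    """Predict the next sensor event from the previous n-1 events,
    using add-one smoothing over the observed sensor vocabulary."""

    def __init__(self, n=3):
        self.n = n
        self.context_counts = defaultdict(Counter)  # (n-1)-gram -> next-event counts
        self.vocab = set()

    def train(self, events):
        """Count n-grams in a sequence of sensor-event IDs."""
        self.vocab.update(events)
        for i in range(len(events) - self.n + 1):
            context = tuple(events[i:i + self.n - 1])
            nxt = events[i + self.n - 1]
            self.context_counts[context][nxt] += 1

    def predict(self, context):
        """Return the most probable next event given the last n-1 events."""
        context = tuple(context[-(self.n - 1):])
        counts = self.context_counts.get(context, Counter())
        total = sum(counts.values())
        # Add-one smoothing: events unseen in this context keep nonzero probability.
        scored = {e: (counts[e] + 1) / (total + len(self.vocab)) for e in self.vocab}
        return max(scored, key=scored.get) if scored else None


# Toy usage: a stream of binary-sensor firings from a hypothetical deployment.
stream = ["bed_motion", "hall_motion", "bathroom_door", "hall_motion",
          "kitchen_motion", "fridge_door", "kitchen_motion", "hall_motion"]
model = NGramPredictor(n=3)
model.train(stream)
print(model.predict(["kitchen_motion", "fridge_door"]))  # e.g. "kitchen_motion"
```

Prediction accuracy in this setting is simply the fraction of events in a held-out sensor stream for which the model's top-ranked next event matches the event that actually occurred.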