Structure learning on large scale common sense statistical models of human state

  • Authors:
  • William Pentney; Matthai Philipose; Jeff Bilmes

  • Affiliations:
  • Department of Computer Science & Engineering, University of Washington; Intel Research Seattle; Department of Electrical Engineering, University of Washington

  • Venue:
  • AAAI'08: Proceedings of the 23rd National Conference on Artificial Intelligence - Volume 3
  • Year:
  • 2008

Abstract

Research has shown promise in the design of large-scale common sense probabilistic models to infer human state from environmental sensor data. These models have made use of mined and preexisting common sense data, along with traditional probabilistic machine learning techniques, to improve recognition of the state of everyday human life. In this paper, we demonstrate effective techniques for structure learning on graphical models designed for this domain, improving the SRCS system of Pentney et al. (2006) by learning additional dependencies between variables. Because the models used for common sense reasoning typically involve a large number of variables, issues of scale arise in searching for additional dependencies; we discuss how we use data mining techniques to address this problem. We show experimentally that these techniques improve the accuracy of state prediction, and that, with a good prior model, the use of a common sense model with structure learning provides better prediction of unlabeled as well as labeled variables. The results also demonstrate that it is possible to collect new common sense information about daily life using such a statistical model and labeled data.
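To make the scale issue concrete: with thousands of state variables, scoring every possible pairwise dependency is impractical, which is why the abstract points to data mining techniques for pruning the search. The sketch below is not the SRCS algorithm from the paper; it is a minimal, assumption-labeled illustration of one standard approach, where a support threshold (frequent co-occurrence) filters candidate variable pairs before scoring them with empirical mutual information. The function name `mine_candidate_edges` and the parameters `min_support` and `top_k` are hypothetical.

```python
# Hypothetical sketch (not the SRCS method): mine candidate pairwise
# dependencies from binary state observations. A support threshold prunes
# pairs that rarely co-occur, and the surviving pairs are ranked by
# empirical mutual information as candidate edges for structure learning.

from itertools import combinations
from math import log

import numpy as np


def mine_candidate_edges(X, min_support=0.05, top_k=50):
    """X: (n_samples, n_vars) 0/1 array of observed state variables.
    Returns up to top_k (i, j, score) tuples as candidate dependencies."""
    n, d = X.shape
    p = X.mean(axis=0)  # marginal P(x_i = 1) for each variable
    candidates = []
    for i, j in combinations(range(d), 2):
        p_ij = float(np.mean(X[:, i] & X[:, j]))  # joint P(x_i=1, x_j=1)
        if p_ij < min_support:  # data-mining style pruning step
            continue
        # empirical mutual information over the four joint configurations
        mi = 0.0
        for a in (0, 1):
            for b in (0, 1):
                pab = float(np.mean((X[:, i] == a) & (X[:, j] == b)))
                pa = p[i] if a else 1 - p[i]
                pb = p[j] if b else 1 - p[j]
                if pab > 0 and pa > 0 and pb > 0:
                    mi += pab * log(pab / (pa * pb))
        candidates.append((i, j, mi))
    candidates.sort(key=lambda t: t[2], reverse=True)
    return candidates[:top_k]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(500, 20)).astype(bool)
    X[:, 3] = X[:, 7]  # plant one strong dependency for illustration
    for i, j, score in mine_candidate_edges(X.astype(int), top_k=5):
        print(f"candidate edge ({i}, {j}): MI = {score:.3f}")
```

In this toy run, the planted pair (3, 7) dominates the ranking; in a real system, the top-ranked pairs would be the candidate dependencies considered for addition to the graphical model, with the prior model and labeled data deciding which ones survive.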