Complex activity recognition using context-driven activity theory in home environments
NEW2AN'11/ruSMART'11 Proceedings of the 11th international conference and 4th international conference on Smart spaces and next generation wired/wireless networking
A middleware for pervasive situation-awareness
DAIS'12 Proceedings of the 12th IFIP WG 6.1 international conference on Distributed Applications and Interoperable Systems
An integrated framework for human activity classification
Proceedings of the 2012 ACM Conference on Ubiquitous Computing
Complex activity recognition using context-driven activity theory and activity signatures
ACM Transactions on Computer-Human Interaction (TOCHI)
A major challenge in pervasive computing is to develop systems that reliably recognize human activity patterns, such as bathing, from sensor data. Typical sensor deployments generate sparse datasets with thousands of sensor readings but few instances of each activity. The imbalance between the number of features (i.e., sensors firing) and the classification targets (i.e., activities) complicates the learning process. In this paper, we propose a novel framework for discovering relationships between sensor signals and observed human activities from sparse datasets. The framework builds on Bayesian networks, modeling each activity by representing the statistical dependencies between sensors. This allows us to solve two key problems: first, how to automatically determine an effective structure for a Bayesian network that recognizes a particular activity without human intervention; and second, the pragmatic problem of sparse training data, where the data available to train the activity recognizers is limited. In our approach, we "learn" the structure of the Bayesian networks automatically from the sensor data. We optimize this process in three ways: first, we perform multicollinearity analysis to focus on orthogonal sensor data with minimal redundancy. Second, we use Efron's bootstrap to generate large training sets that capture the important features of an activity. Finally, we find the best Bayesian network that explains our data using a heuristic search that is unbiased with respect to the ordering of consecutive variables. We evaluate our approach using a dataset gathered from MIT's PlaceLab. The inferred networks correctly identify activities 85% of the time.
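The first two preprocessing steps described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the greedy correlation threshold, function names, and toy data below are all assumptions made for the example. The idea is the same, though: drop sensor columns that are nearly collinear with ones already kept, then enlarge the sparse training set by resampling instances with replacement (Efron's bootstrap).

```python
import random
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation of two equal-length sensor columns."""
    mx, my = mean(xs), mean(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    if vx == 0 or vy == 0:
        return 0.0  # a constant column carries no correlation signal
    return cov / (vx * vy) ** 0.5

def filter_collinear(columns, threshold=0.9):
    """Greedy multicollinearity filter (illustrative stand-in for the
    paper's analysis): keep a sensor column only if it is not highly
    correlated with any column already kept."""
    keep = []
    for j, col in enumerate(columns):
        if all(abs(pearson(col, columns[k])) < threshold for k in keep):
            keep.append(j)
    return keep

def bootstrap(instances, n, seed=None):
    """Efron's bootstrap: resample labeled training instances with
    replacement to build a larger training set from sparse data."""
    rng = random.Random(seed)
    return [rng.choice(instances) for _ in range(n)]

# Toy data: the third sensor duplicates the first, so it is filtered out.
sensor_columns = [[1, 2, 3, 4], [0, 1, 1, 0], [1, 2, 3, 4]]
print(filter_collinear(sensor_columns))          # -> [0, 1]

# A handful of (features, activity) instances bootstrapped to 50.
training = [((1, 0), "bathing"), ((0, 1), "cooking")]
print(len(bootstrap(training, 50, seed=1)))      # -> 50
```

The third step, searching for a network structure without a bias toward any particular variable ordering, is the expensive part in practice and is typically driven by a score such as BIC over candidate edge sets; it is omitted here for brevity.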