Recognizing human activities from sensor readings has recently attracted much research interest in pervasive computing due to its potential in many applications, such as assistive living and healthcare. This task is particularly challenging because, in real life, human activities are performed not only in a simple (i.e., sequential) manner but also in complex (i.e., interleaved or concurrent) ways. Little work has been done to address these complex cases. Existing models of interleaved and concurrent activities are typically learning-based; such models lack flexibility in practice because activities can be interleaved and performed concurrently in many different ways. In this paper, we propose a novel pattern mining approach that recognizes sequential, interleaved, and concurrent activities in a unified framework. We exploit Emerging Patterns—discriminative patterns that capture significant changes between classes of data—to identify sensor features for classifying activities. Unlike existing learning-based approaches, which require different training data sets to build activity models, our activity models are built from the sequential activity trace alone and can be applied to recognize both simple and complex activities. We conduct empirical studies by collecting real-world traces, evaluating the performance of our algorithm, and comparing it with static and temporal models. Our results demonstrate that, with a time slice of 15 seconds, we achieve an accuracy of 90.96 percent for sequential activities, 88.1 percent for interleaved activities, and 82.53 percent for concurrent activities.
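To make the core idea concrete, the sketch below shows how an Emerging Pattern can be mined: an itemset of sensor features qualifies if its support grows sharply from a background class to a target activity class. This is a minimal illustration, not the paper's algorithm; the example feature names (`stove_on`, `pot`, `tap`, `tv_on`) and the thresholds are hypothetical.

```python
from itertools import combinations

def support(itemset, transactions):
    """Fraction of transactions (sets of sensor features) containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def mine_emerging_patterns(target, background, min_supp=0.3, min_growth=5.0, max_len=2):
    """Brute-force mining of Emerging Patterns up to max_len items.

    An itemset is an Emerging Pattern for the target class when its support
    there is at least min_supp and its growth rate -- support in the target
    class divided by support in the background class -- is at least min_growth.
    """
    items = set().union(*target)  # all sensor features seen in the target class
    patterns = []
    for k in range(1, max_len + 1):
        for combo in combinations(sorted(items), k):
            s = frozenset(combo)
            supp_t = support(s, target)
            if supp_t < min_supp:
                continue  # too rare in the target class to be useful
            supp_b = support(s, background)
            growth = float('inf') if supp_b == 0 else supp_t / supp_b
            if growth >= min_growth:
                patterns.append((s, supp_t, growth))
    return patterns

# Hypothetical sensor-feature traces: each set is one time slice of observations.
cooking = [{'stove_on', 'pot'}, {'stove_on', 'pot', 'tap'}, {'stove_on'}]
other   = [{'tv_on'}, {'tap'}, {'tv_on', 'tap'}]
eps = mine_emerging_patterns(cooking, other, min_supp=0.5, min_growth=3.0)
```

Here `{'stove_on'}` and `{'stove_on', 'pot'}` emerge as patterns (they never occur in the background class, so their growth rate is unbounded), while `{'tap'}` does not, since it is common to both classes. A classifier can then score each candidate activity by the patterns its current sensor features match.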