In long-term activity recognition, large sets of inertial sensor data must be analyzed in which the physical actions of the sensor's wearer are captured non-stop for weeks to months. These massive time series burden processing and, especially, any post-analysis of the data. We propose a method for approximating and matching accelerometer time series that is fast on large data sets, well suited to human acceleration data, and efficient to log on the sensors. Experiments show that approximation and matching are faster than with traditional methods, while remaining competitive in recognizing motion patterns.
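The abstract does not specify the approximation scheme; a minimal sketch of the general idea, using piecewise aggregate approximation (PAA) as a stand-in baseline (the function names and segment counts below are illustrative assumptions, not the authors' method), is:

```python
import numpy as np

def paa(series, n_segments):
    """Piecewise aggregate approximation: replace each of n_segments
    roughly equal-width windows by its mean, shrinking the series."""
    series = np.asarray(series, dtype=float)
    return np.array([seg.mean() for seg in np.array_split(series, n_segments)])

def paa_distance(a, b, n_segments):
    """Euclidean-style distance computed on the reduced (PAA) series;
    matching on the compact representation is cheaper than on raw data."""
    ra, rb = paa(a, n_segments), paa(b, n_segments)
    seg_len = len(np.asarray(a)) / n_segments
    return float(np.sqrt(seg_len * np.sum((ra - rb) ** 2)))

# Illustrative usage: two synthetic signals standing in for one
# accelerometer axis, compared after 20x reduction (200 -> 10 values).
t = np.linspace(0, 4 * np.pi, 200)
walk = np.sin(t)
jog = np.sin(2 * t)
print(paa_distance(walk, jog, 10))
```

Matching on the reduced series makes cost scale with the number of segments rather than the raw sample count, which is the kind of speed-up the abstract claims for long-term recordings.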