Ultrasound-based movement sensing, gesture-, and context-recognition
Proceedings of the 2013 International Symposium on Wearable Computers
The paper demonstrates how ultrasonic hand tracking can be used to improve the performance of a wearable, accelerometer- and gyroscope-based activity recognition system. Specifically, we target the recognition of manipulative gestures of the type found in assembly and maintenance tasks. We discuss how relevant information can be extracted from the ultrasonic signal despite the low sampling rate, occlusions, and reflections that occur in this type of application. We then introduce several methods of fusing the ultrasound and motion sensor information. We evaluate our methods on an experimental data set containing 21 different actions performed repeatedly by three subjects during simulated bike repair. Due to the complexity of the recognition task, with many similar and vaguely defined actions and person-independent training, both the ultrasound and motion sensors perform poorly on their own. However, with our fusion methods, recognition rates well over 90% can be achieved for most activities. In extreme cases, recognition rates rise from just over 50% for the separate classifiers to nearly 89% with our fusion methods.
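The abstract does not specify which fusion schemes are used, but a common baseline for combining two per-modality classifiers is late fusion: weighted averaging of their class-posterior estimates. The sketch below illustrates this idea only; the function name, weighting scheme, and example posteriors are assumptions, not the paper's actual method.

```python
import numpy as np

def late_fusion(p_ultrasound, p_motion, w=0.5):
    """Combine per-class posterior estimates from two classifiers by
    weighted averaging (a generic late-fusion baseline; the paper's
    actual fusion methods are not given in the abstract)."""
    p_us = np.asarray(p_ultrasound, dtype=float)
    p_mo = np.asarray(p_motion, dtype=float)
    fused = w * p_us + (1.0 - w) * p_mo
    return fused / fused.sum()  # renormalize to a probability vector

# Hypothetical posteriors over three gesture classes:
p_us = [0.5, 0.3, 0.2]   # ultrasound-based classifier
p_mo = [0.2, 0.6, 0.2]   # accelerometer/gyroscope-based classifier
fused = late_fusion(p_us, p_mo)
print(int(fused.argmax()))  # -> 1: fusion overturns the ultrasound vote
```

Here each modality alone favors a different class, but the averaged posteriors select class 1, mirroring how fusing weak individual classifiers can yield a more reliable joint decision.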