We propose a two-stage recognition system for detecting arm gestures related to human meal intake. Information retrieved from such a system can be used for automatic dietary monitoring in the domain of behavioural medicine. We demonstrate that arm gestures can be clustered and detected using inertial sensors. To validate our method, we present experimental results covering 384 gestures from two subjects. Using isolated discrimination based on hidden Markov models (HMMs), an accuracy of 94% is achieved. When spotting the gestures in continuous movement data, an accuracy of up to 87% is reached.
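The second stage described above classifies a segmented gesture by comparing its likelihood under one HMM per gesture class. As a minimal sketch of that idea (not the authors' implementation): the model parameters, the two gesture classes, and the quantised sensor symbols below are hypothetical, and a discrete left-to-right HMM with the scaled forward algorithm stands in for whatever feature pipeline the real system uses.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]          # initial step
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:                  # induction steps with rescaling
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        loglik += np.log(s)
        alpha /= s
    return loglik

# Two hypothetical 2-state gesture models over 3 quantised accelerometer symbols.
pi = np.array([1.0, 0.0])                       # always start in state 0
A  = np.array([[0.8, 0.2],                      # left-to-right topology,
               [0.0, 1.0]])                     # common for gesture HMMs
B_drink = np.array([[0.90, 0.05, 0.05],
                    [0.05, 0.05, 0.90]])
B_eat   = np.array([[0.05, 0.90, 0.05],
                    [0.90, 0.05, 0.05]])

seq = [0, 0, 0, 2, 2]                           # toy segmented gesture
models = {"drink": B_drink, "eat": B_eat}
scores = {g: forward_loglik(seq, pi, A, B) for g, B in models.items()}
print(max(scores, key=scores.get))              # maximum-likelihood class
```

In a spotting setting, the same per-class scores would additionally be compared against a rejection threshold so that non-gesture movement segments are discarded rather than forced into one of the classes.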