Multimodal user interfaces in IPS2
ICDHM'11: Proceedings of the Third International Conference on Digital Human Modeling
Within the SFB/TR29, one research focus is on human factors and their integration into Industrial Product-Service Systems (IPS2). These innovative systems are complex and dynamic: human operators must perform a multitude of demanding tasks in such socio-technical systems, and this high complexity poses a challenge to them. Automatic assistance systems are therefore necessary for the overall reliability and effectiveness of such a system. This article describes a theoretical approach to simulating human behavior with cognitive models. The actions performed by the operator are recognized through motion capturing combined with machine learning. By comparing the perceived action against reality, a description of the situation can be generated automatically in real time, which can be used, for example, to provide the human operator with real-time contextual feedback.
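The recognition step described above (motion capturing combined with machine learning) could be prototyped, for instance, as a nearest-template classifier over motion-capture feature sequences using dynamic time warping (DTW). This is only a minimal sketch: the action labels, the single-channel feature (e.g. wrist height over time), and the templates are hypothetical, not taken from the article.

```python
def dtw_distance(a, b):
    """DTW cost between two 1-D feature sequences (O(len(a) * len(b)))."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def classify_action(observed, templates):
    """Return the label whose template is DTW-closest to the observation."""
    return min(templates, key=lambda label: dtw_distance(observed, templates[label]))

# Hypothetical single-channel templates (e.g. wrist height over time).
templates = {
    "reach": [0.0, 0.2, 0.5, 0.9, 1.0],
    "lower": [1.0, 0.8, 0.4, 0.1, 0.0],
}
observed = [0.0, 0.1, 0.4, 0.8, 1.0, 1.0]
print(classify_action(observed, templates))  # → reach
```

DTW tolerates the varying execution speeds typical of human motion, which is why a plain Euclidean frame-by-frame comparison would be a poorer fit here; a real system would of course operate on multi-channel sensor features and a learned model rather than hand-set templates.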