Non-manual cues in automatic sign language recognition
Personal and Ubiquitous Computing
When users of computer systems are given the opportunity to provide feedback on their preferences or interests, in most cases they are presented with a questionnaire to fill in. This means of interaction, however, requires an additional cognitive step, since the user must rationalize feelings and attitudes by answering predetermined questions; as a result, users often skip this part or answer hastily, depriving systems of valuable criticism `from the field'. In this paper, we present a system that relates data extracted from the posture and movement of the user's head and eye gaze to states related to interest and engagement. The system is deployed in a Human-Computer Interaction context in which users read documents on a computer screen, and it provides non-verbal input which can, in turn, be further processed.
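To make the idea concrete, the mapping from head-pose and gaze observations to engagement states could be sketched as follows. This is a minimal illustrative sketch only: the feature names (`head_pitch`, `fixation_ms`, etc.), thresholds, and the three-state discretisation are assumptions for exposition, not the authors' actual model.

```python
# Hypothetical sketch: estimating engagement from head pose and gaze.
# All feature names and thresholds below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class GazeSample:
    head_pitch: float      # degrees; 0 = head facing the screen
    head_yaw: float        # degrees; 0 = head facing the screen
    fixation_ms: float     # duration of the current gaze fixation
    on_screen: bool        # whether gaze falls on the document area

def engagement_score(samples):
    """Return a 0..1 engagement estimate over a window of samples."""
    if not samples:
        return 0.0
    score = 0.0
    for s in samples:
        # Head roughly oriented toward the screen (assumed thresholds).
        facing = abs(s.head_pitch) < 15 and abs(s.head_yaw) < 20
        # Gaze resting on the document long enough to suggest reading.
        attending = s.on_screen and s.fixation_ms > 200
        score += 0.5 * facing + 0.5 * attending
    return score / len(samples)

def engagement_state(score):
    """Discretise the score into coarse interest/engagement states."""
    if score >= 0.7:
        return "engaged"
    if score >= 0.3:
        return "neutral"
    return "disengaged"
```

Such a score could then serve as the non-verbal input the abstract mentions, replacing or complementing an explicit questionnaire.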