A tutorial on hidden Markov models and selected applications in speech recognition
Readings in speech recognition
Integrating active perception with an autonomous robot architecture
AGENTS '98 Proceedings of the second international conference on Autonomous agents
Perseus: an extensible vision system for human-machine interaction
The BATmobile: towards a Bayesian automated taxi
IJCAI'95 Proceedings of the 14th international joint conference on Artificial intelligence - Volume 2
Recognizing and interpreting gestures on a mobile robot
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 2
Automatic gesture recognition is a well-known research area, but the usability of this type of human-machine interface has received little attention. This paper presents the evaluation of a man-machine interface from two points of view: i) automatic recognition, and ii) usability as perceived by the end-users. We chose 5 gestures to control a mobile robot. With 8 end-users, the interface achieved 59.46% correct recognition using Dynamic Bayesian Networks, a better result than the 43.03% average obtained with Hidden Markov Models. This contrasts with the 84.01% obtained with a single end-user, showing the difficulty of extending the system to a larger number of end-users. The usability of the interface was tested with 6 of the 8 end-users, divided into 3 categories: expert, semi-expert, and non-expert. The results show similar perceptions between expert and non-expert end-users.