Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration
This paper presents a study in which users define intuitive gestures for navigating a humanoid robot. For eleven navigational commands, we analyzed 385 gestures performed by 35 participants. The results reveal user-defined gesture sets for both novice and expert users. In addition, we present a taxonomy of the user-defined gestures, agreement scores for the gesture sets, time performances of the gesture motions, and implications for the design of robot control, with a focus on gesture recognition and user interfaces.
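The agreement scores mentioned above are typically computed with the formula from gesture-elicitation methodology (Wobbrock et al.): for a referent r with proposal set P_r, the score is the sum over each group P_i of identical proposals of (|P_i|/|P_r|)². The sketch below illustrates this computation; the sample proposal labels and counts are hypothetical, not data from the study.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent:
    A_r = sum over identical-gesture groups P_i of (|P_i| / |P_r|)^2.
    `proposals` is a list of gesture labels, one per participant."""
    total = len(proposals)
    counts = Counter(proposals)
    return sum((n / total) ** 2 for n in counts.values())

# Hypothetical example: 35 participants propose gestures for one command.
proposals = ["push"] * 20 + ["point"] * 10 + ["wave"] * 5
print(round(agreement_score(proposals), 3))  # → 0.429
```

A score of 1.0 means all participants proposed the same gesture; scores near 1/|P_r| indicate no consensus.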