Multi-touch technologies hold much promise for the command and control of mobile robot teams. To improve the learnability and usability of these interfaces, we conducted an experiment to determine the gestures people would naturally use, rather than the gestures a pre-designed system would instruct them to use. A set of 26 tasks with differing control needs was presented sequentially on a DiamondTouch table to 31 participants. We found that the task of controlling robots elicited unique gesture sets and considerations not previously observed in desktop-like applications. In this paper, we present the details of these findings, a taxonomy of the gesture set, and guidelines for designing gesture sets for robot control.