Controlling the movements of a mobile robot, including driving the robot through the world and panning the robot's cameras, typically requires many physical joysticks, buttons, and switches. Operators often cope with this situation through a technique called "chording": much like a piano player, the operator simultaneously actuates multiple joysticks and switches with his or her hands to create a combination of complementary movements. However, these physical controls sit in fixed locations and cannot easily be reprogrammed. Using a Microsoft Surface multi-touch table, we have designed an interface that allows chording and simultaneous multi-handed interaction anywhere the user wishes to place his or her hands. Taking inspiration from the biomechanics of the human hand, we have created a dynamically resizing, ergonomic, and multi-touch controller (the DREAM Controller). This paper presents the design and testing of this controller with an iRobot ATRV-JR robot.
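To illustrate the idea of a controller that resizes and repositions itself to the user's hand, the sketch below places a virtual joystick at the centroid of one hand's touch contacts and scales its radius to the hand's spread. This is a minimal geometric sketch, not the paper's actual implementation; the function name `fit_virtual_joystick` and the radius-to-spread ratio are assumptions introduced here for illustration.

```python
import math

def fit_virtual_joystick(touches):
    """Place and size a virtual joystick under one hand.

    touches: list of (x, y) contact points from a single hand.
    Returns ((cx, cy), radius): the joystick centre at the contact
    centroid, with radius scaled to the hand's spread so that a larger
    hand gets a proportionally larger control surface.
    """
    n = len(touches)
    cx = sum(x for x, _ in touches) / n
    cy = sum(y for _, y in touches) / n
    # Hand spread: farthest fingertip distance from the centroid.
    spread = max(math.hypot(x - cx, y - cy) for x, y in touches)
    # The 0.5 scale factor is an assumed design parameter, not from the paper.
    return (cx, cy), 0.5 * spread
```

Any time the hand is lifted and placed elsewhere on the table, the same fit can be recomputed from the new contact points, which is what frees the interaction from fixed physical control locations.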