Numerous robots have been developed, and some are already in use in homes, institutions, and workplaces. Despite advances in robot functionality, however, little attention has been paid to robot user interfaces: general users find it hard to understand what a robot is doing and what kind of work it can perform. This paper presents an interface for commanding home robots with stroke gestures drawn on a computer screen. The interface lets users control robots and design their behaviors by sketching actions on a top-down view from ceiling-mounted cameras; to convey a feeling of directly controlling the robots, it uses the live camera view. In this study, we focus on house cleaning, a task typical of home robots, and develop a sketch interface for designing the behaviors of vacuuming robots.
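The core mapping such an interface needs is from a stroke drawn in camera pixel coordinates to a path the robot can follow on the floor. The sketch below is a minimal, hypothetical illustration of that step (the function name, the uniform pixel-to-metre scale, and the waypoint-spacing threshold are all assumptions, not part of the paper's system; a real setup would use a calibrated camera homography rather than a flat scale factor):

```python
# Hypothetical sketch: converting a stroke on a ceiling-camera view into
# floor waypoints for a vacuuming robot. Assumes the camera looks straight
# down and one metre on the floor spans px_per_metre pixels; a deployed
# system would instead apply a calibrated image-to-floor homography.

def stroke_to_waypoints(stroke_px, px_per_metre=100.0, min_step_m=0.10):
    """Map stroke points (pixels) to floor waypoints (metres),
    dropping points closer than min_step_m to the previous waypoint."""
    waypoints = []
    for x_px, y_px in stroke_px:
        x_m, y_m = x_px / px_per_metre, y_px / px_per_metre
        if not waypoints:
            waypoints.append((x_m, y_m))  # always keep the stroke start
            continue
        last_x, last_y = waypoints[-1]
        # Keep the point only if it is far enough from the last waypoint,
        # thinning the densely sampled stroke into a sparse path.
        if ((x_m - last_x) ** 2 + (y_m - last_y) ** 2) ** 0.5 >= min_step_m:
            waypoints.append((x_m, y_m))
    return waypoints

# Example: a diagonal stroke sampled every 5 pixels across the camera view.
stroke = [(i, i) for i in range(0, 101, 5)]
path = stroke_to_waypoints(stroke)
```

Thinning the stroke this way keeps the sketched shape while leaving the robot a short, followable waypoint list instead of hundreds of near-duplicate input samples.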