User learning and performance with marking menus
CHI '94 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
A mark-based interaction paradigm for free-hand drawing
UIST '94 Proceedings of the 7th annual ACM symposium on User interface software and technology
Interactive sketching for the early stages of user interface design
CHI '95 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Path drawing for 3D walkthrough
Proceedings of the 11th annual ACM symposium on User interface software and technology
Teddy: a sketching interface for 3D freeform design
Proceedings of the 26th annual conference on Computer graphics and interactive techniques
GestureLaser and GestureLaser Car: development of an embodied space to support remote instruction
Proceedings of the Sixth European conference on Computer supported cooperative work
CyberCode: designing augmented reality environments with visual tags
DARE '00 Proceedings of DARE 2000 on Designing augmented reality environments
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Building a Multimodal Human-Robot Interface
IEEE Intelligent Systems
Interaction with a Projection Screen Using a Camera-tracked Laser Pointer
MMM '98 Proceedings of the 1998 Conference on MultiMedia Modeling
ARTag, a Fiducial Marker System Using Digital Techniques
CVPR '05 Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), Volume 2
Using a hand-drawn sketch to control a team of robots
Autonomous Robots
A point-and-click interface for the real world: laser designation of objects for mobile manipulation
Proceedings of the 3rd ACM/IEEE international conference on Human robot interaction
Sketch and run: a stroke-based interface for home robots
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Blinkbot: look at, blink and move
UIST '10 Adjunct proceedings of the 23rd annual ACM symposium on User interface software and technology
Proceedings of the 6th international conference on Human-robot interaction
ClippingLight: a method for easy snapshots with projection viewfinder and tilt-based zoom control
Proceedings of the 2nd Augmented Human International Conference
Roboshop: multi-layered sketching interface for robot housework assignment and management
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
PUB - Point Upon Body: exploring eyes-free interaction and methods on an arm
Proceedings of the 24th annual ACM symposium on User interface software and technology
Proceedings of the 10th Asia Pacific Conference on Computer Human Interaction
PICOntrol: using a handheld projector for direct control of physical devices through visible light
Proceedings of the 25th annual ACM symposium on User interface software and technology
Efficient jitter compensation using double exponential smoothing
Information Sciences: an International Journal
exTouch: spatially-aware embodied manipulation of actuated objects mediated by augmented reality
Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction
Sublimate: state-changing virtual and physical rendering to augment interaction with shape displays
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Shape recognition of laser beam trace for human-robot interface
Pattern Recognition Letters
Gesture-based interaction with voice feedback for a tour-guide robot
Journal of Visual Communication and Image Representation
A laser pointer can be a powerful tool for robot control. In the past, however, its use in robotics has been limited to simple target designation, leaving its potential as a versatile input device unexplored. This paper proposes a laser pointer-based user interface for giving a robot a variety of instructions by applying stroke gesture recognition to the laser's trajectory. With this interface, the user draws stroke gestures with a laser pointer to specify target objects and the commands the robot should execute on them. The system, which includes lasso and dwell gestures for object selection, stroke gestures for robot operation, and a push-button command for movement cancellation, was refined from its initial prototype through several user-study evaluations. Our results suggest that laser pointers are effective not only for designating targets but also for specifying the command and the target location of a robot's action.
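The dwell gesture mentioned above can be illustrated with a minimal sketch: a dwell is detected when the tracked laser spot stays within a small radius for long enough. The data format, function name, and thresholds here are illustrative assumptions, not the paper's actual parameters:

```python
import math

def detect_dwell(samples, radius=10.0, min_duration=0.5):
    """Scan a trajectory of (time, x, y) samples and return the anchor
    point of the first dwell: a span lasting at least `min_duration`
    seconds whose points all stay within `radius` pixels of the span's
    first point. Returns None if no dwell is found."""
    n = len(samples)
    for i in range(n):
        t0, x0, y0 = samples[i]
        j = i
        # Extend the span while the spot stays near the anchor point.
        while j + 1 < n:
            _, x, y = samples[j + 1]
            if math.hypot(x - x0, y - y0) > radius:
                break
            j += 1
        if samples[j][0] - t0 >= min_duration:
            return (x0, y0)
    return None
```

A jittery but stationary spot (small displacements over 0.6 s) yields its anchor point, while a fast sweep yields `None`; in a real system the radius and duration would be tuned against pointer jitter, which the reference list's double-exponential-smoothing entry addresses.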