The style in which a robot moves, expressed through its gait or locomotion, can convey effective messages to people. For example, a robot could move aggressively in reaction to a person's actions, or alternatively react with careful, submissive movements. Designing, implementing, and programming robotic interfaces that react to users' actions with properly styled movements can be a difficult, daunting, and time-consuming technical task. Most people, on the other hand, can easily perform such stylistic movements themselves, for example by acting them out. Following this observation, we propose to let people use their existing teaching skills to demonstrate a desired style of interaction directly to a robot via in-situ acting. In this paper we present an initial style-by-demonstration (SBD) proof of concept of our approach, which allows people to teach a robot specific, interactive locomotion styles by providing a demonstration. We present a broomstick-robot interface for directly demonstrating locomotion style to a collocated robot, and a design-critique evaluation in which experienced programmers compare our SBD approach to traditional programming methods.