Developing Robot Motions by Simulated Touch Sensors
SIMPAR '08 Proceedings of the 1st International Conference on Simulation, Modeling, and Programming for Autonomous Robots
Gesture-based programming is a new paradigm intended to ease the burden of programming robots. By tapping into the user's wealth of experience with contact transitions, compliance, uncertainty, and operations sequencing, we hope to provide a more intuitive programming environment for complex, real-world tasks based on the expressiveness of nonverbal communication. A prerequisite is the ability to interpret gestures and infer the intentions behind them. As a first step toward this goal, this paper presents an application of distributed perception for inferring a user's intentions by observing tactile gestures. These gestures consist of sparse, inexact, physical "nudges" applied to the robot's end effector for the purpose of modifying its trajectory in free space. A set of independent agents, each with its own local, fuzzified, heuristic model of a particular trajectory parameter, observes data from a wrist force/torque sensor to evaluate the gestures. Each agent then independently determines the confidence of its findings, and distributed arbitration resolves the interpretation through voting.
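To make the agent-and-voting scheme concrete, the following is a minimal illustrative sketch, not the paper's actual implementation: each hypothetical agent holds a fuzzy membership function over one component of the wrist force/torque reading and reports a confidence in [0, 1], and arbitration selects the interpretation with the highest vote. The agent names, membership shapes, and thresholds are all invented for illustration.

```python
# Illustrative sketch (assumptions, not the paper's implementation):
# independent agents, each with a fuzzy heuristic model of one trajectory
# parameter, score a force/torque sample; arbitration picks the top vote.

from dataclasses import dataclass
from typing import Callable, Dict, List


def triangular(center: float, width: float) -> Callable[[float], float]:
    """Triangular fuzzy membership function peaking at `center`."""
    def mu(x: float) -> float:
        return max(0.0, 1.0 - abs(x - center) / width)
    return mu


@dataclass
class Agent:
    name: str                                    # trajectory parameter modeled
    membership: Callable[[List[float]], float]   # fuzzy confidence in [0, 1]


# Hypothetical agents keyed to components of the 6-axis force/torque
# vector (fx, fy, fz, tx, ty, tz); centers and widths are made up.
speed_up  = Agent("increase speed", lambda ft: triangular( 5.0, 3.0)(ft[0]))
slow_down = Agent("decrease speed", lambda ft: triangular(-5.0, 3.0)(ft[0]))
veer_left = Agent("shift left",     lambda ft: triangular( 4.0, 2.0)(ft[1]))


def arbitrate(agents: List[Agent], ft_sample: List[float]) -> str:
    """Distributed arbitration by voting: every agent independently
    reports its confidence; the highest nonzero vote wins."""
    votes: Dict[str, float] = {a.name: a.membership(ft_sample) for a in agents}
    winner = max(votes, key=votes.get)
    return winner if votes[winner] > 0.0 else "no gesture"


# A forward nudge along x registers mostly with the speed-up agent.
sample = [4.5, 0.5, 0.0, 0.0, 0.0, 0.0]
print(arbitrate([speed_up, slow_down, veer_left], sample))  # increase speed
```

Because each agent evaluates only its own local model, agents can be added or removed without touching the arbitration step, which mirrors the distributed-perception design described in the abstract.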