Although the grasp-task interplay in our daily life is unquestionable, very little robotics research has addressed this problem. To bridge the gap between the grasp and the task, we adopt the most successful approaches to grasp and task specification and extend them with additional elements that allow a grasp-task link to be defined. We propose a global sensor-based framework for the specification and robust control of physical interaction tasks, in which the grasp and the task are considered jointly on the basis of the task frame formalism and the knowledge-based approach to grasping. We also present a physical interaction task planner, based on the new concept of task-oriented hand preshapes. The planner focuses on the manipulation of articulated parts in home environments and can automatically specify all the elements of a physical interaction task required by the proposed framework. Finally, several applications are described, showing the versatility of the proposed approach and its suitability for the rapid implementation of robust physical interaction tasks on very different robotic systems.
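The task frame formalism mentioned above assigns each axis of a task-centered frame either a velocity setpoint or a force setpoint, so that motion is commanded along free directions while constrained directions remain compliant. The following is a minimal illustrative sketch of how such a specification, together with its grasp link, might be represented as a data structure; all class names, field names, frames, and numeric values here are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass, field
from enum import Enum


class ControlMode(Enum):
    """Per-axis control mode in the task frame formalism."""
    VELOCITY = "velocity"
    FORCE = "force"


@dataclass
class AxisSpec:
    """Control mode and setpoint for one task-frame axis."""
    mode: ControlMode
    reference: float  # desired velocity (m/s, rad/s) or force/torque (N, Nm)


@dataclass
class PhysicalInteractionTask:
    """Hypothetical container linking a grasp to a task-frame specification."""
    hand_preshape: str            # task-oriented preshape, e.g. "hook"
    grasp_frame: str              # frame on the object where the hand attaches
    task_frame: str               # frame in which the task is controlled
    # keys: "x", "y", "z", "rx", "ry", "rz"; unspecified axes are left free
    axes: dict = field(default_factory=dict)


# Illustrative example: opening a door. A rotational velocity is commanded
# about the hinge axis, while zero force is commanded on the constrained
# translational axes so the mechanism itself guides the motion.
open_door = PhysicalInteractionTask(
    hand_preshape="hook",
    grasp_frame="handle",
    task_frame="hinge",
    axes={
        "rz": AxisSpec(ControlMode.VELOCITY, 0.2),  # rotate about the hinge
        "x": AxisSpec(ControlMode.FORCE, 0.0),      # comply with the constraint
        "y": AxisSpec(ControlMode.FORCE, 0.0),
    },
)
```

A planner in this style would emit one such object per task, and a hybrid force/velocity controller would then consume the per-axis specifications; the split between velocity-controlled and force-controlled axes is the essence of the formalism.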