Integrating human-computer interaction with planning for a telerobotic system
The Multimodal User Supervised Interface and Intelligent Control (MUSIIC) project focuses on a multimodal human-machine interface that addresses users' need to manipulate familiar objects in an unstructured environment. Controlling a robot by individuals with significant physical limitations presents a challenging telemanipulation problem. MUSIIC addresses it with a unique user interface that integrates the user's spoken commands and pointing gestures with autonomous planning techniques built on knowledge bases and 3-D vision. The resulting test-bed offers the opportunity to study telemanipulation by individuals with physical disabilities, and its approach generalizes to other telemanipulation settings, including remote and time-delayed operation. This paper focuses on the knowledge-driven planning mechanism that is central to the MUSIIC system.
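The kind of speech-plus-gesture fusion the abstract describes can be sketched as follows. This is a minimal illustration, not the MUSIIC implementation: all names, data structures, and the nearest-object deixis resolution are assumptions made for the example. The idea shown is that a spoken verb supplies the action, a pointing gesture is resolved against 3-D object positions from vision, and a knowledge base supplies object-specific parameters the planner needs.

```python
# Hypothetical sketch of multimodal command fusion in the style of MUSIIC.
# All names and structures are illustrative assumptions, not the actual system.
from dataclasses import dataclass

@dataclass
class ObjectModel:
    name: str                       # label stored in the knowledge base
    position: tuple                 # 3-D location reported by the vision system
    grasp_width: float              # grasp parameter from the knowledge base

# Toy knowledge base populated by the vision subsystem (assumed contents).
KNOWLEDGE_BASE = {
    "cup":  ObjectModel("cup",  (0.40, 0.10, 0.0), 0.07),
    "book": ObjectModel("book", (0.20, -0.30, 0.0), 0.03),
}

def resolve_deixis(pointed_position, tolerance=0.05):
    """Map a pointing gesture to the nearest known object within tolerance."""
    best, best_d = None, tolerance
    for obj in KNOWLEDGE_BASE.values():
        d = sum((a - b) ** 2 for a, b in zip(obj.position, pointed_position)) ** 0.5
        if d < best_d:
            best, best_d = obj, d
    return best

def fuse_command(spoken_verb, pointed_position):
    """Combine speech ('pick up that') with a gesture into one plan step."""
    target = resolve_deixis(pointed_position)
    if target is None:
        return None                 # planner asks the user to point again
    return (spoken_verb, target.name, target.grasp_width)

plan_step = fuse_command("pick-up", (0.41, 0.09, 0.0))
```

Resolving the deictic reference ("that") against the object database is what lets the user issue short, natural commands while the knowledge base fills in the manipulation details.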